Research Papers

The Heinz College has an international reputation for the quality of its research. Our interdisciplinary environment creates exciting opportunities for collaboration and produces a breadth of research work not typically found in schools of our size.

Our faculty and research centers consistently receive funding support from government agencies, foundations and corporate partners. The National Science Foundation; the Heinz Endowments; The Mellon Foundation; the U.S. Departments of Defense, Commerce, Health and Human Services, and Housing and Urban Development; the Sloan Foundation and the National Institute of Justice are some of the many organizations supporting our work.

Doctoral students often collaborate with Heinz College faculty on research projects and receive funding on their own to conduct their research work. The small size of our doctoral program provides significant opportunities for students to engage in meaningful research early in their studies and to have their work published in important journals.
 

Title | Author | Type | Year
Aggregation Bias

Vibhanshu Abhishek, Kartik Hosanagar and Peter S. Fader. Aggregation Bias in Sponsored Search Data: The Curse and The Cure, forthcoming, Marketing Science

Abhishek, Vibhanshu
Article, 2014
For a current list of Working Papers

See the author's personal website.

Richards-Shubik, Seth
Working Paper, 2014
Is Marijuana Safer than Alcohol? Insights from Users' Self-Reports

Those who favor legalizing marijuana often argue that marijuana is safer than alcohol. While alcohol is certainly more dangerous in some respects, risk is multi-faceted, and this paper demonstrates that marijuana users report more problems with their use than do alcohol users.

Caulkins, Jonathan
Working Paper, 2014
Killing the Golden Goose or Just Chasing it Around the Farmyard?: Generic Entry and the Incentives for Early-Stage Pharmaceutical Innovation

Over the last decade, generic penetration in the U.S. pharmaceutical market has increased substantially, providing significant consumer surplus gains. What impact has this rise in generic penetration had on the rate and direction of early stage pharmaceutical innovation? We explore this question using novel data sources and an empirical framework that models the flow of early-stage pharmaceutical innovations as a function of generic penetration, scientific opportunity, firm innovative capability, and additional controls. While the overall aggregative level of drug development activity has remained fairly stable, our estimates suggest a sizable, robust, negative relationship between generic penetration and early-stage pharmaceutical research activity within therapeutic markets. A 10% increase in generic penetration decreases early-stage innovations in the same market by 7.9%. This effect is weaker, but still economically and statistically significant in top therapeutic markets where an increase in generic penetration by 10% decreases the flow of early-stage innovations by 2.1%. Our estimated effects appear to vary across therapeutic classes in sensible ways, reflecting the differing degrees of substitution between generics and branded drugs in treating different diseases. Finally, we are able to document that with increasing generic penetration, firms in our sample are shifting their R&D activity to more biologics-based (large-molecule) products rather than chemicals-based (small-molecule) products as evidenced in their early-stage pipelines. We conclude by discussing the potential implications of our results for long-run consumer welfare, policy, and innovation.

Branstetter, Lee
Chatterjee, Chirantan
Working Paper, 2014
Regulation and Welfare: Evidence from Paragraph IV Generic Entry in the Pharmaceutical Industry

With increasing frequency, generic drug manufacturers in the U.S. are able to challenge the monopoly status of patent-protected drugs before their patents expire. The legal foundation for these challenges is found in Paragraph IV of the Hatch-Waxman Act. If successful, these challenges generally lead to large market share losses for incumbents and sharp declines in average market prices. This paper estimates, for the first time, the welfare effects of accelerated generic entry via these challenges. Using aggregate brand level sales data for hypertension drugs in the U.S. we estimate demand using both a nested logit model and a random coefficients logit model in order to back out cumulated consumer surplus. We then undertake a counterfactual analysis, removing the stream of Paragraph IV facilitated generic products and recomputing consumer surplus in their absence, in order to estimate the consumer surplus gains associated with Paragraph IV entry. Results based on the more flexible random coefficients logit specification indicate that gains flowing to consumers over 2000-2008 as a result of this regulatory mechanism amount to around $41 billion. These gains come at the expense of producers who lose, approximately, $18 billion. This suggests that net short-term social gains stand at around $23 billion. Estimated net social gains are more than twice as large when we employ a nested logit approach over the same sample period. We also demonstrate significant cross-molecular substitution within the market and discuss the possible appropriation of consumer rents by the insurance industry. Policy and innovation implications are discussed.

Revise and resubmit at the RAND Journal of Economics.

Branstetter, Lee
Working Paper, 2014
ROC-Based Model Estimation for Forecasting Large Changes in Demand

Forecasting large changes in demand should benefit from estimation methods different from those used to estimate mean behavior. We develop a multivariate forecasting model designed to detect the largest changes across many time series. The model is fit with a penalty function that maximizes true positive rates along a relevant false positive rate range, and it can be used by managers who wish to take action on the small percentage of products likely to change the most in the next time period. We apply the model to a crime dataset, using OLS as the baseline for comparison along with models promising for exceptional demand forecasting, such as quantile regression, synthetic data from a Bayesian model, and a power loss model. Using the Partial Area Under the Curve (PAUC) metric, our results are statistically significant, showing a 35 percent improvement over OLS and at least a 20 percent improvement over competing methods. We suggest that managers responsible for a growing number of products use our method for forecasting.

Gorr, Wilpen
Schneider, Matthew
Working Paper, 2014
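The Partial Area Under the Curve (PAUC) metric named in the abstract above can be illustrated with a short sketch of our own (not the authors' code): it is the ROC area accumulated only over a low false-positive-rate range, here normalized by that range so a perfect ranker scores 1.0. Function name and data are hypothetical.

```python
def partial_auc(y_true, scores, max_fpr=0.2):
    """ROC area over FPR in [0, max_fpr], normalized by max_fpr so a
    perfect ranker scores 1.0.  y_true holds 0/1 labels (1 = event,
    e.g. a large demand change); scores are model outputs."""
    pos = sum(y_true)
    neg = len(y_true) - pos
    pairs = sorted(zip(scores, y_true), key=lambda p: -p[0])
    tp = fp = 0
    area, prev_fpr, prev_tpr = 0.0, 0.0, 0.0
    i, n = 0, len(pairs)
    while i < n:
        j = i
        while j < n and pairs[j][0] == pairs[i][0]:  # process score ties together
            tp += pairs[j][1]
            fp += 1 - pairs[j][1]
            j += 1
        fpr, tpr = fp / neg, tp / pos
        if fpr >= max_fpr:  # interpolate the final trapezoid exactly at max_fpr
            frac = (max_fpr - prev_fpr) / (fpr - prev_fpr) if fpr > prev_fpr else 0.0
            tpr_cut = prev_tpr + frac * (tpr - prev_tpr)
            area += (max_fpr - prev_fpr) * (prev_tpr + tpr_cut) / 2
            return area / max_fpr
        area += (fpr - prev_fpr) * (prev_tpr + tpr) / 2  # trapezoid rule
        prev_fpr, prev_tpr = fpr, tpr
        i = j
    return area / max_fpr

# A ranker that puts all true events first is perfect on any FPR range:
print(partial_auc([1, 1, 0, 0], [0.9, 0.8, 0.2, 0.1], max_fpr=0.5))  # 1.0
```

Restricting the area to a small `max_fpr` is what matches the managerial setting described above: only a small percentage of products can be acted on, so only the low-false-positive region of the ROC curve matters.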
Stochastic Frontier Models

 Horrace, William C., Seth Richards-Shubik, and Ian A. Wright. 2014. “Expected Efficiency Ranks from Parametric Stochastic Frontier Models.” Empirical Economics, forthcoming.

Richards-Shubik, Seth
Article, 2014
The Power of Hydroelectric Dams: Agglomeration Spillovers

 How much of the geographic clustering of economic activity is attributable to agglomeration spillovers as opposed to natural advantages? I present evidence on this question using data on the long-run effects of large scale hydroelectric dams built in the U.S. over the 20th century, obtained through a unique comparison between counties with or without dams but with similar hydropower potential. Until mid-century, the availability of cheap local power from hydroelectric dams conveyed an important advantage that attracted industry and population. By the 1950s, however, these advantages were attenuated by improvements in the efficiency of thermal power generation and the advent of high tension transmission lines. Using a novel combination of synthetic control methods and event-study techniques, I show that, on average, dams built before 1950 had substantial short run effects on local population and employment growth, whereas those built after 1950 had no such effects. Moreover, the impact of pre-1950 dams persisted and continued to grow after the advantages of cheap local hydroelectricity were attenuated, suggesting the presence of important agglomeration spillovers. Over a 50 year horizon, I estimate that at least one half of the long run effect of pre-1950 dams is due to spillovers. The estimated short and long run effects are highly robust to alternative procedures for selecting synthetic controls, to controls for confounding factors such as proximity to transportation networks, and to alternative sample restrictions, such as dropping dams built by the Tennessee Valley Authority or removing control counties with environmental regulations. I also find small local agglomeration effects from smaller dam projects, and small spillovers to nearby locations from large dams. Lastly, I find relatively small costs of environmental regulations associated with hydroelectric licensing rules.

Severnini, Edson
Working Paper, 2014
Correlates of Homicide: New Space/Time Interaction Tests for Spatiotemporal Point Processes

Statistical inference on spatiotemporal data often proceeds by focusing on the temporal aspect of the data, ignoring space, or on the spatial aspect, ignoring time. In this paper, we explicitly focus on the interaction between space and time. Using a geocoded, time-stamped dataset of almost 9 million calls to 911 in Chicago between 2007 and 2010, we ask whether any of these call types are associated with shootings or homicides. Standard correlation techniques do not produce meaningful results in the spatiotemporal setting because of two confounds: purely spatial effects (e.g., "bad" neighborhoods) and purely temporal effects (e.g., more crimes in the summer) could introduce spurious correlations. To address this issue, a handful of statistical tests for space-time interaction have been proposed that explicitly control for separable spatial and temporal dependencies. Yet these classical tests each have limitations. We propose a new test for space-time interaction, using a Mercer kernel-based statistic for measuring the distance between probability distributions. We compare our new test to existing tests on simulated and real data, where it performs comparably to or better than the classical tests. For the application we consider, we find a number of interesting and significant space-time interactions between 911 call types and shootings/homicides.

Flaxman, Seth
Neill, Daniel
Smola, Alex
Working Paper, 2013
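The "Mercer kernel-based statistic for measuring the distance between probability distributions" described in the abstract above is, in spirit, a maximum mean discrepancy (MMD) statistic. A minimal one-dimensional sketch, assuming a Gaussian (RBF) kernel; this is our own illustration, not the paper's implementation:

```python
import math

def rbf(x, y, gamma=1.0):
    """Gaussian (RBF) Mercer kernel on scalars."""
    return math.exp(-gamma * (x - y) ** 2)

def mmd2(xs, ys, gamma=1.0):
    """Biased estimate of the squared maximum mean discrepancy between
    two samples: zero when the samples are identical, positive when
    they come from visibly different distributions."""
    kxx = sum(rbf(a, b, gamma) for a in xs for b in xs) / len(xs) ** 2
    kyy = sum(rbf(a, b, gamma) for a in ys for b in ys) / len(ys) ** 2
    kxy = sum(rbf(a, b, gamma) for a in xs for b in ys) / (len(xs) * len(ys))
    return kxx + kyy - 2.0 * kxy

# Identical samples give distance 0; well-separated samples approach 2.
print(mmd2([0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))   # 0.0
print(mmd2([0.0, 0.0, 0.0], [9.0, 9.0, 9.0]))   # ~2.0
```

In the space-time testing context, a statistic of this family compares the observed joint space-time distribution of events against the product of its spatial and temporal marginals, so purely spatial or purely temporal structure cancels out.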
Deterrence in the Twenty-first Century: A Review of the Evidence

Deterrence in the Twenty-first Century: A Review of the Evidence
Forthcoming in Crime and Justice: An Annual Review.

Nagin, Daniel
Working Paper, 2013
Early Warning System for Crime Hot Spots

Using violent crime data from Pittsburgh, Pennsylvania we investigate the performance of an “early warning system” (EWS) for starting/stopping police deployments to hot spots for crime prevention. We show that (1) even the hottest chronic hot spots are dynamic with months “on” and “off” and (2) temporary hot spots are also targets for prevention. We compare the performance of EWS to constant deployment at chronic hot spots.

Gorr, Wilpen
Working Paper, 2013
Estimating Heterogeneous Preferences for Housing with Latent Quality

Estimating Heterogeneous Preferences for Housing with Latent Quality

Quintero, Luis
Working Paper, 2013
Examining the Impact of Ranking on Consumer Behavior and Search Engine Revenue.

 Examining the Impact of Ranking on Consumer Behavior and Search Engine Revenue. 2013. Forthcoming in Management Science. (with Anindya Ghose and Panagiotis G. Ipeirotis)

Li, Beibei
Article, 2013
For-Profit Earnings

Turner, Abby Clay. 2013. Selection Bias and For-Profit Associate Degrees: Evidence from the NLSY Geocode. Working paper.

Abstract: The past decade has seen enormous growth in the for-profit higher education industry, and along with it, enormous debate over the relative costs and benefits of such an education. Utilizing the rich data from the NLSY97 Geocode merged with institutional data from IPEDS, I empirically analyze data on individuals with two-year degrees, estimate the average marginal earnings gain from a two-year degree, and compare the effects of degrees across institutional sector and across major area of study using OLS with family background and extensive demographic controls. I find evidence of selection at three levels: selection into college, selection into type of college, and selection into major area of study. Any estimates of labor market returns to these degrees will be biased until future research unravels and models these selection mechanisms and processes. This chapter provides a first look into the differential inputs and outputs of for-profit and public two-year degree programs. I find that a two-year degree results in an 8.1 percent average marginal earnings gain over a high-school diploma, and that the sector of the degree-granting institution alone does not affect this gain. I also find that earnings gains vary greatly by major; individuals with "academic" degrees experience no significant earnings gains, while individuals with "vocational/technical" degrees on average experience a 32.7 percent earnings gain. I find statistical differences in the marginal earnings gains across institutional sector within major fields of study, suggesting that attending a for-profit does matter when major field of study is taken into account.
Policy-makers should take note that this preliminary analysis of the returns (which can be thought of as the private benefits) to public and for-profit degrees does not provide unambiguous evidence in favor of one sector over another, but rather a first look into the “black box” in which students, institutions, and major areas of study come together and jointly determine labor market outcomes.  

Turner, Abby
Article, 2013
Job Market Paper
Turner, Abby Clay. 2013. Labor Strife in Public Schools: Does it Affect Education Production? Job market paper.
 

Abstract: Teacher strikes and the right of public employees to collectively bargain are topics of frequent and heated debate in the public sphere, with little research available to inform the debate. In firms, the negative relationship between labor unrest and productivity is well-documented; the purpose of this study is to explore whether a similar, measurable relationship exists between labor strife and productivity in public schools. Using regression analysis on data covering teacher strikes and expired contracts over a seven-year period in Pennsylvania, I find that pass rates on a district-level cohort's math tests decrease by about 1-2 percentage points in the year of a strike and by about 0.5 percentage points during a year in which teachers work under an expired contract. Additionally, cohorts experiencing a strike during their 11th-grade year see about a 2 percentage-point decrease in their graduation rate. In addition to improving upon the methodologies of previous teacher strike papers, this paper distinguishes between productivity loss due to strikes and that due to lengthy ongoing labor disputes that do not necessarily end in a strike. Policy implications include making administrators aware of the possible effect of a strike on graduation rates and the need for better collection of data on collective bargaining by state agencies.

Turner, Abby
Working Paper, 2013
Modeling and Estimating the Price-Quality Frontier in Markets for Durable Housing

Modeling and Estimating the Price-Quality Frontier in Markets for Durable Housing with Dennis Epple and Holger Sieg, 2013

Quintero, Luis
Working Paper, 2013
Optimal Bidding

Vibhanshu Abhishek and Kartik Hosanagar. Optimal Bidding in Multi-Item Multi-Slot Sponsored Search Auctions, Operations Research 61(4): 855-873 (2013)

Abhishek, Vibhanshu
Article, 2013
Surviving Social Media Overload: Predicting Consumer Footprints on Product Search Engines.

Surviving Social Media Overload: Predicting Consumer Footprints on Product Search Engines. 2013. (with Anindya Ghose and Panagiotis G. Ipeirotis)

Li, Beibei
Working Paper, 2013
The Globalization of R&D: China, India, and the Rise of International Co-invention

Abstract: The rapid rise of China and India as innovating nations seems to contradict conventional views of the economic growth and development process. In standard models, the acquisition of innovative capacity in frontier technologies emerges as one of the final stages in a long development process. China and India are still poor, yet advanced nations are granting rapidly growing numbers of patents to inventors based in these countries. Our analysis of these patents shows that a majority of them are granted to local inventor teams working for foreign multinationals. An important fraction of these patents also incorporate direct intellectual inputs from researchers outside China or India, a trend that we characterize as "international co-invention." As such, the international patenting surge of China and India does not represent a challenge to traditional models of growth and development, so much as it represents a move toward an expanded international division of labor within global R&D networks.

Branstetter, Lee
Li, Guangwei
Veloso, Francisco
Working Paper, 2013
The Incredible Shrinking Portuguese Firm

Using Portugal's extensive matched employer-employee data set, this paper documents an unusual feature of the Portuguese economy. For decades, the entire Portuguese firm size distribution has been shifting to the left. We argue in this paper that Portugal's shrinking firms are linked to the country's anemic growth and low productivity. We show that the shift in the Portuguese firm size distribution is not reflected in other advanced industrial economies for which we have been able to obtain comparable data. Careful attempts to account for expanding data coverage, a structural shift from manufacturing to services, and aggressive efforts to 'demonopolize' the Portuguese economy leave more than half of this shift unexplained by these factors. So, what does explain this shift? We argue that Portugal's uniquely strong protections for regular workers have played an important role. Drawing upon an emerging literature that attributes much of the productivity gap between advanced nations and developing nations to the misallocation of resources across firms in developing countries, we develop a theoretical model that shows how Portugal's labor market institutions could prevent more productive firms from reaching their optimal size, thereby constraining GDP per capita. Calibration exercises based on this model quantify the degree of labor market distortion consistent with the recent shifts in the Portuguese firm size distribution. These calibration exercises suggest quite substantial growth effects could arise if the distortions were lessened or abolished altogether. 

Braguinsky, Serguey
Branstetter, Lee
Regateiro, Andre
Working Paper, 2013
Who to Marry and Where to Live: Estimating a Collective Marriage Market Model

 I study the joint choice of spouse and location made by individuals at the start of their adult lives in the U.S. I assume that potential spouses meet in a marriage market and decide who to marry and where they will live, taking account of varying economic opportunities in different locations and inherent preferences for living near the families of both spouses. I develop a theoretical framework that incorporates a collective model of household allocation, conditional on the choice of spouse and location, with a forward-looking model of the marriage market that allows for the potential inability of spouses to commit to a particular intra-household sharing rule. Estimation results for young dual-career households in the 2000 Census point to three main findings. First, I find excess sensitivity of the sharing rule that governs the allocation of resources among couples to the conditions in the location they actually choose, implying that spouses cannot fully commit to a sharing rule. Second, I show that the lack of commitment has a relatively larger effect on the share of family resources received by women. Third, I find that the failure of full commitment can explain nearly all of the gap in the intra-state migration rates of single and married people in the U.S.

Research In Progress


Severnini, Edson
Working Paper, 2013
A Stata Plugin for Estimating Group-Based Trajectory Models

 A Stata Plugin for Estimating Group-Based Trajectory Models

Nagin, Daniel
Working Paper, 2012
Air Pollution, Power Grid, and Infant Health: Evidence from the Shutdown of Nuclear Power Plants in the Tennessee Valley Authority in the 1980s

When environmental regulations focus on a subset of power plants, the ultimate goal of protecting human health may not be reached. Because power plants are interconnected through the electrical grid, excessive scrutiny of one group of facilities may shift pollution to another group, with potentially deleterious effects on public health. I study the impact of the shutdown of nuclear power plants in the Tennessee Valley Authority (TVA) in the 1980s on infant health outcomes. After the Three Mile Island accident in 1979, the Nuclear Regulatory Commission (NRC) intensified inspections of nuclear facilities, leading to the shutdown of many of them, including Browns Ferry and Sequoyah in the TVA area. I first show that, in response to the shutdown, electricity generation shifted mostly to coal-fired power plants within the TVA, increasing pollution in the counties where they were located. Second, I find that babies born after the shutdown had lower birth weight in those counties with coal-fired power plants. Third, I provide suggestive evidence that these effects were heterogeneous, depending on how much additional electricity those coal-fired facilities generated in response to the shutdown. These findings may shed light on some potential consequences of the retirement of the San Onofre Nuclear Plant in California and of the ongoing nuclear power phase-out in Germany, greatly intensified after the Japanese Fukushima disaster.

Severnini, Edson
Working Paper, 2012
Control Paper

Brandimarte, L., Acquisti, A. and Loewenstein, G. (2012). Misplaced Confidences: Privacy and the Control Paradox. Social Psychological and Personality Science (in press).

Brandimarte, Laura
Article, 2012
Designing Ranking Systems for Hotels on Travel Search Engines by Mining User-Generated and Crowd-Sourced Content

 Designing Ranking Systems for Hotels on Travel Search Engines by Mining User-Generated and Crowd-Sourced Content. 2012. Marketing Science. (with Anindya Ghose and Panagiotis G. Ipeirotis)

Li, Beibei
Article, 2012
Deterrence: A Review of the Evidence by a Criminologist for Economists

Deterrence: A Review of the Evidence by a Criminologist for Economists

Nagin, Daniel
Working Paper, 2012
Discounting

Brandimarte, L., Vosgerau, J. and Acquisti, A. Depreciation of the Past: Diagnostic Behaviors Have a Longer-Lasting Impact Than Non-Diagnostic Behaviors (Working Paper).

Brandimarte, Laura
Working Paper, 2012
Handbook Chapter

Brandimarte, L. and Acquisti, A. (2012). The Economics of Privacy. In M. Peitz and J. Waldfogel (Eds.), Handbook of the Digital Economy, Oxford University Press.

Brandimarte, Laura
Chapter, 2012
Induced Innovation

Cutler, David M., Ellen Meara, and Seth Richards-Shubik. 2012. “Induced Innovation and Social Inequality: Evidence from Infant Medical Care.” Journal of Human Resources, 47(2): 456-492.

Richards-Shubik, Seth
Article, 2012
The Effect of Incarceration on Re-Offending: Evidence from a Natural Experiment in Pennsylvania

The Effect of Incarceration on Re-Offending: Evidence from a Natural Experiment in Pennsylvania

Nagin, Daniel
Working Paper, 2012
The Effect of Public Policies on Inducing Technological Change in Renewable Energies

This paper evaluates the effect of several public policies on the technological progress of four renewable energies (solar, wind, ocean, and geothermal) in Europe, the U.S. and Japan over the last 30 years.

Vincenzi, Marco
Working Paper, 2012
Monte Carlo

Horrace, William C., and Seth Richards-Shubik. 2012. “A Monte Carlo Study of Ranked Efficiency Estimates from Frontier Models.” Journal of Productivity Analysis, 38(2): 155-165.

Richards-Shubik, Seth
Article, 2011
Auto-probit Model for Multiple Regimes of Network Effects

Auto-probit Model for Multiple Regimes of Network Effects

Zhang, Bin
Working Paper, 2011
Estimating App Demand from Publicly Available Data

Garg, Rajiv and Telang, Rahul, "Estimating App Demand from Publicly Available Data" (2012)

ssrn.com/abstract=1924044

Garg, Rajiv
Telang, Rahul
Working Paper, 2011
Estimating Black-White Mortality Differences by State of Birth Using Census and Vital Statistics Data: A Simple GMM Approach

Estimating Black-White Mortality Differences by State of Birth Using Census and Vital Statistics Data: A Simple GMM Approach

Hsu, Yu-Chieh
Working Paper, 2011
Innovation, Commercialization and the Successful Startup

 Innovation, Commercialization and the Successful Startup (Working Paper)

Jordan, James
Working Paper, 2011
Longitudinal Study of Crime Hot Spots

Longitudinal Study of Crime Hot Spots: Dynamics and Impact on Part 1 Violent Crime 

Gorr, Wilpen
Working Paper, 2011
Short and Long Run Effects of Disease and Pollution: Evidence from the 1918 Influenza Pandemic Interacted with Air Pollution from Fossil-Fuel Power Plants (joint with Karen Clay (CMU) and Joshua Lewis (University of Montreal))

Urban outdoor air pollution is responsible for 1.3 million premature deaths annually (WHO, 2009). In addition to these direct effects, poor air quality may also harm the immune system, lowering underlying health and leaving individuals more susceptible to the consequences of a negative health shock. In this study, we examine whether exposure to air pollution exacerbates the effects of a negative health shock. We investigate the interaction between the 1918 influenza pandemic and pre-determined levels of air pollution arising from fossil-fuel power plants, based on newly collected data on those plants in the 1910s. We study the short-run impact on mortality using county-level vital statistics data, the medium-run effects on labor market outcomes and educational attainment using county-level data computed from the 1940 U.S. Census, and the impact on the long-run outcomes studied by Almond (2006) using state-level data from the 1960-80 decennial U.S. Censuses. To the best of our knowledge, this is the first attempt to understand how pollution interacts with health conditions in determining short- and long-term outcomes.

Clay, Karen
Severnini, Edson
Working Paper, 2011
Violence in the “Balance”: A Structural Analysis of How Rivals, Allies, and Third-Parties Shape Inter-Gang Violence

Nakamura, Kiminori, George Tita, and David Krackhardt. 2011. Violence in the “Balance”: A Structural Analysis of How Rivals, Allies, and Third-Parties Shape Inter-Gang Violence. Heinz College Working Paper Series.

Nakamura, Kiminori
Working Paper, 2011
Rise in Educational Gradients in Mortality

Cutler, David M., Fabian Lange, Ellen Meara, Seth Richards-Shubik, and Christopher J. Ruhm. 2011. “Rising Educational Gradients in Mortality: The Role of Behavioral Risk Factors.” Journal of Health Economics, 30(6): 1174-87.

Richards-Shubik, Seth
Article, 2011
Predicting Hospital Safety

Ashwood, J.S. Predicting Hospital Safety: A Comparison of Four Models. Working paper.

Ashwood, Jefferson
Working Paper, 2010
Benefits and Costs of Electricity in pre-Clean Air Act United States (joint with Karen Clay (CMU) and Joshua Lewis (University of Montreal))

China and India are opening thermal power plants at an extremely rapid pace. Many have little or no emission controls. This is similar to the United States in the age of electrification, particularly the period from the 1930s-1960s, when transmission lines and economies of scale allowed utilities to build large generation plants outside of cities, and generation plants did nothing to limit their emissions beyond building moderately tall smokestacks. Using county-level data for 1930-1960, this paper investigates the economic gains from local electrification and the pollution costs associated with fossil fuel generation. The empirical analysis combines newly digitized data on power generation and transmission with county-level data on mortality and land values. In the baseline analysis, we relate annual changes in county-level all age and infant mortality rates to the construction of fossil-fuel power plants in a county-fixed effects framework. Local availability dictated the types of fuels used, and fuels varied in their emissions from low emission natural gas to high emission eastern bituminous coal. We employ a difference-in-differences approach to compare counties with power facilities using fossil fuels with different emission levels per megawatt hour. We also compare the net benefits of local thermal power generation to local hydropower generation and expansion of the power grid. This analysis sheds light on historical consequences of electrification in the United States, and is relevant for ongoing electrification policy in developing countries.

Clay, Karen
Severnini, Edson
Working Paper, 2010
Compulsory Schooling and the Family

Compulsory Schooling, the Family, and the "Foreign Element": Evidence from the United States, 1880-1900. Second Heinz College paper.

Lingwall, Jeff
Article, 2010
Consolidation, Insurance Coverage & Tort Reform in Health Care Markets

Containing health care expenditures and extending health insurance coverage have been the most important health care policy issues in the U.S. Health care expenditures surpassed $2.3 trillion in 2008, more than three times the $714 billion spent in 1990 (Centers for Medicare and Medicaid Services (2010)), while more than one in five adults under age 65 (22%) was uninsured in 2009 (Kaiser Family Foundation (2010)). This dissertation investigates three topics related to either or both of these issues. In the first chapter, we analyze how market concentration in the health care market affects the price of hospital services, an important factor in determining health care expenditures. In the second chapter, I propose a preliminary framework for exploring which policy options extend coverage without imposing excessive costs. The third chapter analyzes the effects of state tort reform, which has been considered an effective means to contain health care expenditures. The first chapter analyzes the relationship between insurer and hospital market concentration and the prices of hospital services. There has been substantial consolidation among health insurers and hospitals recently, raising questions about the effects of this consolidation on the exercise of market power. We use a national U.S. dataset containing transaction prices for health care services for over eleven million privately insured Americans. Using three years of panel data, we estimate how insurer and hospital market concentration is related to hospital prices, while controlling for unobserved market effects. We find that increases in insurance market concentration are significantly associated with decreases in hospital prices, while increases in hospital concentration are non-significantly associated with increases in prices. A hypothetical merger between two of five equally-sized insurers is estimated to decrease hospital prices by 6.7%.
The second chapter employs a simulation approach to develop a model of health insurance coverage and to analyze the effects of policies to extend coverage. The model is a microeconomic model of individual decision making with regard to health insurance coverage, health-care consumption, and insurer determination of premiums. I calibrate the parameters of the model by the simulated method of moments (SMM), using data from the Medical Expenditure Panel Survey (MEPS) in 2003. Introduction of a health insurance voucher in the individual health insurance markets is found to reduce the uninsured rate from 20.4% to 16.4%, whereas introducing the voucher in all health insurance markets reduces the rate to 9.2%. The change in social welfare is almost negligible under the former policy, whereas social welfare decreases by 5.9% under the latter. Mandating CDHP-type high-deductible plans decreases the rate of the uninsured to 13.2%, and social welfare increases by 21.4%. This study proposes a preliminary framework for the evaluation of policies for the uninsured. The third chapter analyzes the effects of one type of tort reform, caps on non-economic damages, on utilization and health outcomes among mothers and premature newborns, using hospital discharge data from 2001 to 2007. Preliminary findings are as follows. Implementation of caps on non-economic damages is associated with a 2.8% decrease in length of stay and an 11.5% decrease in the usage of mechanical ventilation among premature newborns, and the caps are associated with a 0.6-1.2% decrease in length of stay among mothers. There is no significant association between the implementation of the caps and the mortality of premature newborns. By analyzing the effects of caps on the usage of specific procedures, this chapter contributes to the understanding of how physicians practice "defensive medicine" in obstetric and neonatal care.

Moriya, Asako
Presentations and Proceedings, 2010
Does Red Tape Hold Back Entrepreneurs? Evidence from Portugal.

Heinz College PhD Second Research Paper, May 2010

Venancio, Ana
Working Paper, 2010
Extracting subpopulations from large social networks

Zhang, Bin
Working Paper, 2010
Impact of Medicare Part D

Impact of Medicare Part D on Generic Prescription Rate in the Long Term Care Facilities

Jung, Changmi
Working Paper, 2010
Impact of Web Portal on Call Center: An Empirical Analysis

Kumar, Anuj
Working Paper, 2010
Information Technology Adoption & Procedural Performance in Health Care Industry

Dissertation August 25, 2010

ABSTRACT:
The initial ideas concerning the usage and potential of information technologies in health care can be traced back to the 1950s (Staropoli, 2008), whereas the earliest adoptions of practical systems by hospitals followed about a decade later, in the mid-1960s (Wager, Lee and Glaser, 2005). Since then, while significant advancements have occurred in the adoption and usage of IT in health care, both in terms of scale and scope, such advancements have been considered slow and unaggressive compared to the same process in other industries (Bowser, 2005). At the same time, especially since the 1990s, spending on IT in the health care industry has increased significantly, both in terms of total spending and as a percentage of the sector's output. However, as IT adoption and renovation have become an important part of many hospitals' agendas, the debate over their effectiveness and efficiency has also come to the fore, not only for the health care industry, but also for policy makers in government and the concerned public in general.

My dissertation work intends to contribute to the understanding of health care IT adoption and its impact. Although it is certainly beyond the scope of this project to reach any unambiguous conclusions on the above questions, I expect my research to add important insights to the existing literature. This dissertation examines two topics relevant to information technologies in the health care industry: (1) the status and change of integrated health care delivery system level IT spending and hospital level IT adoption between 1999 and 2006, and (2) the potential link between hospital level IT adoption and quality as quantified by procedural performance measures. The two chapters of this dissertation cover these two topics respectively.

Shi, Yunfeng
Working Paper, 2010
International Co-Invention in Central and Eastern Europe: the Role of Foreign Direct Investment

This paper analyzes the impact of foreign direct investment on the ability of inventors in Eastern Europe to absorb the most advanced technologies from the West after the fall of the Berlin Wall.

Vincenzi, Marco
Working Paper, 2010
Issues in the Empirical Evaluation of the Black-White Mortality Gap

Hsu, Yu-Chieh
Working Paper, 2010
Promotion and Adverse Events

David, Guy, Sara Markowitz, and Seth Richards-Shubik. 2010. “The Effects of Pharmaceutical Marketing and Promotion on Adverse Drug Events and Regulation.” American Economic Journal: Economic Policy, 2(4): 1-25.

Richards-Shubik, Seth
Article, 2010
Revisiting Firm Size and Job Creation

This paper revisits the lively discussion of the relationship between firm size and job creation and the relationship between firm size and firm growth. We find that the relationship between firm size and firm growth is mediated by industry conditions: in declining or low-growth industries, smaller firms grow faster than larger ones, but that relationship reverses in faster-growing industries. This effect seems to be caused by a greater ability of larger firms to adapt to the economic climate. Small firms are always job creators, while large firms switch from job destroyers to job creators as industry conditions improve. We also find that adapting firm growth to industry growth has important implications for firm survival. These results have important implications for public policy and the theoretical foundations of firm growth.

Regateiro, Andre
Presentations and Proceedings, 2010
Structure, Tie Persistence and Event Detection in Large Phone and SMS Networks

"Structure, Tie Persistence and Event Detection in Large Phone and SMS Networks." Akoglu, Leman and Bhavana Dalvi. Carnegie Mellon University, 2010.

Working Paper, 2010
Sustaining Life – The Role of the Small Business Innovation Research Program (SBIR)

Jordan, James
Working Paper, 2010
The True Impact of Scientific Research on Industrial Technology? A Reassessment of IT Patent Citations
Economic research based on surveys, interviews, and case studies indicates that the linkage between science and industry is significant and growing in importance over time. Statistical analysis of patent-to-paper citations also strongly supports the notion of a growing science-technology linkage. However, this approach seems to suggest that the linkage is highly concentrated in the biotechnology, biomedical, and pharmaceutical domains, and comparatively weak everywhere else. This finding is not only inconsistent with previous research results, it is also difficult to reconcile with the widely held notion that the IT revolution, arguably the most significant development of our time, has grown, in no small part, thanks to important advances in IT-related scientific disciplines. This paper aims to address this apparent inconsistency. We argue that patent citations to papers in the IT industry are created in the context of a special structure of relationships between academic science and industrial R&D that gives rise to a chaining pattern in citations, which, in turn, obscures the true effect of science on technology. Our approach is inspired by a long-known phenomenon in the science bibliometrics literature called "Obliteration by Incorporation", which explains how scientific contributions become embedded in the pool of accepted knowledge of a field, while their sources gradually become forgotten by the community. We verify our claim using patent citation data for a sizable portion of US patents granted between 1983 and 1999. The results provide a new outlook on the effect of scientific research on industrial technology. Based on this outlook, we can create a ranking of scientific research organizations in terms of the impact of their research on industrial R&D.
Presentations and Proceedings, 2010
To Be or Not To Be Linked on LinkedIn

Garg, Rajiv and Telang, Rahul. “To Be or Not To Be Linked on LinkedIn: Job Search Using Online Social Networks” NBER Summer Institute Working Paper 2012

http://ssrn.com/abstract=1813532

http://users.nber.org/~confer/2012/SI2012/PRIT/Garg_Telang.pdf

Garg, Rajiv
Krackhardt, David
Smith, Michael
Telang, Rahul
Working Paper, 2010
A Perspective on Regional Advantage that Focuses on the Individual Experience

The dissertation examines how individuals form expectations and beliefs about opportunities available in the place they live. Opportunities are a powerful motivator for human behavior in a place. Opportunities stimulate people to engage in social activities, initiate business ventures, pursue a better education, and other relevant behavior. Yet, no research in urban or regional development has empirically explored this concept at the individual level, its drivers and underlying mechanisms. The dissertation develops a model where "opportunity beliefs" are shaped both by objective information about economic conditions and resources in a place (i.e., employment, wages, and housing costs) and by other place-specific characteristics, ranging from its socio-demographic structure to its natural and recreational amenities. The model is tested using a survey on place-related attitudes and beliefs conducted by the Gallup Organization in 2006. The survey provides a dataset of 28,000 individuals located in various parts of the United States. The Gallup database has been integrated with actual city demographics and statistics regarding each respondent's city/location using data from the U.S. Census Bureau, the American Community Survey, and other sources. This unique dataset combines individual characteristics and responses with place-specific features.

Results from structural equation modeling estimation show that opportunity beliefs are shaped both by the economic conditions of the place and by amenities such as restaurants, cafes, and art galleries, as well as colleges and universities (collectively referred to as "symbolic amenities"). These factors' effects are moderated by two important mechanisms: the perception of social openness and the perception of economic dynamism. Unemployment rates and other economic conditions have a modest though significant effect on the perception of economic dynamism. Symbolic amenities instead affect both mediators, emerging as powerful "signals" that residents use to form ideas about the economic dynamism of a place. Finally, in the case of symbolic amenities the analysis also shows a significant moderating effect of individual creativity.

Presentations and Proceedings, 2009
Another Look at the Relationship Between Education and Crime: Revisiting Lochner and Moretti

Heinz College PhD Second Research Paper April 2009

Yang, Dou-Yan
Working Paper, 2009
Applying a Group-based Trajectories Methodology to Measures of Patient Safety

Ashwood, Jefferson
Working Paper, 2009
Balance Theory and Structural Facilitators for Planned Organizational Network Change

Hunter, Keith (2009) "Balance Theory and Structural Facilitators for Planned Organizational Network Change," INSNA Sunbelt XXIX, San Diego, CA

Presentations and Proceedings, 2009
Does Money Make the Entrepreneurial World Go Round?

Venancio, Ana
Working Paper, 2009
Does Web-based Self Service Reduce Telephone Calls to the Call Center? An Empirical Analysis

Kumar, Anuj
Working Paper, 2009
Doing the Time Causes the Crime? An Examination of the Dose-Response Relationship Between Time Served & Future Criminal Offending

Snodgrass, Gregory
Working Paper, 2009
Essays on the Design and Evaluation of Information Technology-Enabled Interventions for Chronic Disease Risk Assessment and Communication

Harle, Christopher
Working Paper, 2009
Information Technology, Complementary Capital and the Transatlantic Productivity Divergence

This paper delves into the role of information technology as a relevant driver of the productivity divergence between Europe and the United States in the late 1990s.

Vincenzi, Marco
Working Paper, 2009
Inter-Organizational Groups: A New Context for Examining the Triggers of Group Conflict

Pearce, Brandi
Working Paper, 2009
MCMC Approach to Classical Estimation with Overidentifying Restrictions

Quintero, Luis
Working Paper, 2009
Measuring Information Diffusion in an Online Community

Garg, Rajiv; Smith, Michael D and Telang, Rahul. 2011. “Measuring Information Diffusion in an Online Community.” Journal of Management Information Systems 28 (2): 11–38.

http://mesharpe.metapress.com/openurl.asp?genre=article&issn=0742-1222&volume=28&issue=2&spage=11
 

Garg, Rajiv
Krackhardt, David
Krishnan, Ramayya
Smith, Michael
Telang, Rahul
Working Paper, 2009
New Suppliers & New Markets - Essays on the Global Pharmaceutical Industry - Working Draft of Proposal

Chatterjee, Chirantan
Working Paper, 2009
Pittsburgh’s Targeted Incubator: Taking Innovation to the Next Level

"Pittsburgh’s Targeted Incubator: Taking Innovation to the Next Level." James Jordan and Paul L. Kornblith. Science Progress (http://www.scienceprogress.org), Jan 2009.

Jordan, James
Article, 2009
RFID-Enabled Analysis of Care Coordination and Patient Flow in Ambulatory Care

Lin, Yi-Chin
Working Paper, 2009
The Changing Role of Race in the NBA: A Draft Ordering Approach

Working Paper, 2009
The Effects of the Clean Air Act on Productivity in the Electricity Industry: Evidence from the U.S. 1938-1993 (joint with Karen Clay (CMU) and Joshua Lewis (University of Montreal))

The costs of environmental regulations have been widely debated in the U.S. since the passage of the Clean Air and Water Acts beginning in the 1960s. Using data from 1972-1993, a recent paper by Greenstone, List and Syverson (2012) showed large productivity losses in manufacturing from air quality regulations. The authors' paper also highlighted the need for data spanning longer time periods, since it was impossible to follow plants during a significant portion of the period of regulatory change. This study provides new evidence on the effects of regulation by digitizing and analyzing data on steam power electricity plants covering the period 1938-1993. Federal Power Commission data begin in 1938 and are very detailed, allowing analysis of the effects of regulation on steam power plants that burn coal, natural gas, and oil. By following specific plants over time, one can observe fuel switching by existing plants, fuel choices by new plants, and changes in output resulting from regulation-induced technological modifications. These are relevant for understanding productivity losses in electricity generation in developing countries that are beginning to implement or strengthen enforcement of air quality regulations.

Clay, Karen
Severnini, Edson
Working Paper, 2009
The Local Ecology of New Movement Organizations

Knudsen, Brian
Working Paper, 2009
Three Essays on Enterprise Information System Mining for Business Intelligence

This dissertation proposal consists of three essays on data mining in the context of enterprise information systems.

The first essay develops a clustering algorithm to discover topic hierarchies in text document streams. The key property of this method is that it processes each text document only once, assigning it to the appropriate place in the topic hierarchy as it arrives. This is done by making a distributional assumption about word occurrences and by storing the sufficient statistics at each topic node. The algorithm is evaluated using two standard datasets: Reuters newswire data (Rcv1) and MEDLINE journal abstracts data (Ohsumed). The results show that by using Katz's distribution to model word occurrences we can improve cluster quality in the majority of cases over the commonly used normal distribution assumption.
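The one-pass property described above can be sketched as follows. This is a simplified illustration only: it assigns each arriving document to the closest topic by cosine similarity over word counts, whereas the essay's actual method scores documents under Katz's distribution and maintains a hierarchy, neither of which is reproduced here.

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def one_pass_cluster(docs, threshold=0.3):
    """Assign each document once, on arrival; a low best-similarity spawns a new topic."""
    topics = []   # each topic keeps a Counter of word counts (its sufficient statistic here)
    labels = []
    for doc in docs:
        counts = Counter(doc.split())
        sims = [cosine(counts, t) for t in topics]
        if sims and max(sims) >= threshold:
            best = sims.index(max(sims))
            topics[best] += counts   # update the topic's statistics in place
        else:
            topics.append(counts)    # open a new topic
            best = len(topics) - 1
        labels.append(best)
    return labels

stream = ["cat pet fur", "pet cat toy", "stock market price", "market price index"]
print(one_pass_cluster(stream))  # [0, 0, 1, 1]
```

Because each document updates only the statistics of the topic it joins, memory grows with the number of topics, not the number of documents, which is what makes the streaming setting feasible.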

The second essay develops a collaborative filter for recommender systems using ratings by users on multiple aspects of an item. The key challenge in developing this method was the correlated nature of the component ratings due to the halo effect. This challenge is overcome by identifying the dependency structure between the component ratings using a dependency-tree search algorithm and modeling it in a mixture model. The algorithm is evaluated using a multicomponent rating dataset collected from Yahoo! Movies. The results show that we can improve the retrieval performance of the collaborative filter by using multi-component ratings. We also find that when our goal is to accurately predict the rating of an unseen user-item pair, using multiple components leads to better performance when the training data is sparse; beyond a certain amount of training data, however, using only one component rating leads to more accurate rating prediction.

The third essay develops a framework for analyzing conversation taking place in online social networks. It encodes the text of the conversation and the participating actors in a tensor. With the help of blog data collected from a large IT services firm, it shows that tensor factorization allows us to identify significant topics of conversation as well as the important actors in each. In addition, it proposes three extensions to this study: 1) evaluation of the tensor factorization approach by measuring its accuracy in topic discovery and community discovery; 2) extension of the study by incorporating blog reading data, which is unique because it measures consumption of post topics; and 3) a study of the interdependence of reading, posting, and citation activity in a blog social network.
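Tensor factorization of the kind described above can be sketched with a minimal CP decomposition via alternating least squares. This is a generic illustration; the essay's actual factorization method, tensor layout, and data are not reproduced here, and the toy tensor below is invented.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization of a 3-way tensor (C-order reshape)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker product of two factor matrices."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, rank, iters=10, seed=0):
    """CP decomposition by alternating least squares; returns the factor matrices."""
    rng = np.random.default_rng(seed)
    F = [rng.standard_normal((s, rank)) for s in T.shape]
    for _ in range(iters):
        for mode in range(3):
            others = [F[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[0], others[1])
            # Least-squares update of this mode's factor, holding the others fixed.
            F[mode] = unfold(T, mode) @ kr @ np.linalg.pinv(kr.T @ kr)
    return F

# A rank-1 "actor x actor x term" toy tensor with known loadings.
a = np.array([1.0, 2.0, 3.0])
b = np.array([1.0, 0.5, 2.0, 1.0])
c = np.array([2.0, 1.0, 3.0])
T = np.einsum('i,j,k->ijk', a, b, c)

A, B, C = cp_als(T, rank=1)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.allclose(T, T_hat))  # True
```

In practice one would reach for a dedicated library (e.g., TensorLy) rather than hand-rolled ALS, but the update rule above is the core of the method: each factor matrix is refit by least squares against the matricized tensor while the others are held fixed.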

Presentations and Proceedings, 2009
Three Essays on Health Outcomes

This paper explores two topics related to health outcomes in the adult U.S. population. Over the past fifteen years, the United States has seen a decrease in overall health. Two potential reasons behind this deterioration in physical condition are the rise in overweight and obesity, and increases in adverse health outcomes for individuals requiring hospital stays. This research attempts to comprehend the extent to which these two causes contribute to this decline in health. The paper focuses on three aspects of health outcomes: (1) identifying who among the adult U.S. population has been affected by the obesity epidemic; (2) whether employment increases played a role in the rise of obesity in single women; and (3) whether increases in nurse staffing reduce adverse health outcomes for the inpatient hospital population.

Presentations and Proceedings, 2009
Three Essays on Health Outcomes

We find that while all ages experienced an increase in the prevalence of those overweight and/or obese, the prevalence among young adults has grown at a faster rate than that of older age groups. We find that the increases in body mass index are primarily due to period effects, not cohort or age effects. From the ordered logistic regression analyses, we find that the protective influence of factors such as education, income, and age on an individual's body mass index has decreased over time. The analyses suggest that the increase in the prevalence of those overweight or obese is a phenomenon experienced by all demographic groups in the U.S.

Chapter 2 uses data from a health survey, the Behavioral Risk Factor Surveillance System, for the years 1994 through 2000 to analyze the effect of employment increases due to welfare reform (Temporary Assistance for Needy Families) on body weight and stress in single mothers. A difference-in-difference approach is used to analyze the survey, supplemented by state-level welfare and unemployment information. Additionally, as concern exists regarding incorrect standard errors in such models, I attempt to adjust these errors by correcting the asymptotics. I find that, even using a health survey, welfare reform does result in significant employment increases for single mothers, especially minorities. I find no relationship between welfare reform and either body weight or stress. The analysis suggests that employment increases due to welfare reform did not contribute to the obesity epidemic among adult U.S. women.

I continue my study of health outcomes in Chapter 3. Utilizing Office of Statewide Health Planning and Development data for 2000 through 2005, I use the 1999 California nurse-to-patient ratio legislation to analyze the effect of increases in nurse staffing on adverse patient health outcomes for general medical/surgical hospital units. This law provides a natural experiment where the treatment effect varies across hospital units depending on the difference between the unit's pre-legislated ratio (in 2000) and the mandated ratio (in 2005). I use this variation as the basis for an instrumental variables methodology. I find that nurse-to-patient ratios increased significantly for hospital units that were most in need of raising their ratios to become compliant with the 2005 mandated ratios. However, I find no evidence that these increases had the desired effect of reducing adverse health outcomes in the population.

Additionally, I explore the systematic differences of various hospital sub-sets, and attempt to isolate which sub-sets are actually affected by the legislation. The two sub-sets that I use are by region and by type of hospital financial control. My results show no evidence that adverse patient health outcomes were either positively or negatively affected for various sub-sets of the sample. Finally, as hospital managers claim considerable financial concerns with implementation of this law, I analyze the effect of AB 394 on three economic indicators. My results suggest that the legislation had no significant economic consequences for those hospitals most affected by the law. However, I consider my economic results inconclusive, as I am unable to determine whether hospitals were forced to redistribute resources in order to comply with the law. If such financial restructuring did occur, there is the potential for unintended consequences that I have been unable to pick up.

This dissertation explores two topics related to health outcomes in the adult U.S. population. Over the past fifteen years, the United States has seen a decrease in overall health. Two potential reasons behind this deterioration in physical condition are the rise in overweight and obesity, and increases in adverse health outcomes for individuals requiring hospital stays. My research attempts to comprehend the extent to which these two causes contribute to this decline in health.

My dissertation focuses on three aspects of health outcomes: (1) identifying who among the adult U.S. population has been affected by the obesity epidemic; (2) whether employment increases played a role in the rise of obesity in single women; and (3) whether increases in nurse staffing reduce adverse health outcomes for the inpatient hospital population.

Chapter 1 uses data from the National Health Interview Survey for years spanning 1976 to 2001 to present an age-period-cohort analysis of weight gain throughout the life course. This is a joint project with Beth Osborne Daponte, and has been published in the spring 2008 edition of Population Research and Policy Review.

Working Paper, 2009
Three Essays on the Use of Claims Based Analysis for Evaluating Quality of Healthcare and Making Causal Inferences

Working Paper, 2009
Converting Pirates without Cannibalizing Purchasers: The Impact of Digital Distribution on Physical Sales and Internet Piracy

With the rise of Napster, BitTorrent, and other tools facilitating Internet piracy, rights holders have understandably become very concerned with the development of strategies to mitigate the impact of piracy on sales. These strategies fall into three general categories: litigation, countermeasures, and competition. The literature has addressed the effectiveness of the first two anti-piracy strategies. In this paper we address the third strategy using NBC's decision to remove its content from Apple's iTunes store in December 2007 as a natural shock to the legitimate supply of digital content. To address this question we collect two large datasets from Mininova and Amazon.com documenting the levels of piracy and DVD sales for both NBC and other major networks' content around this event. We then analyze these data in a difference-in-difference model and find that NBC's decision to remove its content from iTunes is causally associated with a 19.99% increase in the demand for NBC's pirated content. This is roughly equivalent to an increase of 92,612 downloads a day for NBC's content. Moreover, we see no change in demand for NBC's DVD content associated with this change.
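The difference-in-difference logic used here can be sketched in a few lines. The numbers below are synthetic and purely illustrative; the paper's actual estimates come from the Mininova and Amazon.com panels.

```python
# Mean piracy demand (arbitrary units) before and after the iTunes removal,
# for the treated network and for a control group of other networks.
treated_pre, treated_post = 100.0, 130.0
control_pre, control_post = 80.0, 88.0

# DiD: (change for treated) minus (change for controls) nets out the
# common time trend that affects all networks, treated or not.
did = (treated_post - treated_pre) - (control_post - control_pre)
print(did)  # 22.0
```

The identifying assumption is parallel trends: absent the removal, NBC's piracy demand would have moved like the control networks' demand.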

Dhanasobhon, Samita
Smith, Michael
Telang, Rahul
Working Paper, 2008
Cues and Heuristics on Capitol Hill: Network-Based Shortcuts in Legislative Decision Making

Working Paper, 2008
Do Spreadsheet Errors Lead to Bad Decisions: Perspectives of Executives and Senior Managers

Spreadsheets are commonly used and commonly flawed, but it is not clear how often spreadsheet errors lead to bad decisions. We interviewed 45 executives and senior managers/analysts in the private, public, and non-profit sectors about their experiences with spreadsheet quality control and with errors affecting decision making. Almost all said spreadsheet errors are common. Quality control was usually informal and applied to the analysis and/or decision, not just the spreadsheet per se. Most respondents could cite instances of errors directly leading to bad decisions, but opinions differed as to whether the consequences of spreadsheet errors are severe. Some thought any big errors would be so obvious as to be caught by even informal review. Others suggested that spreadsheets inform but do not make decisions, so errors do not necessarily lead one-for-one to bad decisions. Still, many respondents believed spreadsheet errors were a significant problem and that more formal spreadsheet quality control could be beneficial.

Caulkins, Jonathan
Working Paper, 2008
Does Part-time Work during High School Affect Academic Outcomes?

A majority of American teenagers work during high school, a large time commitment that pulls teens' attention away from family, friends, school, and community ties. The issue of whether the possible costs to the other parts of teens' lives outweigh the benefits from working has attracted extensive research, but no consensus exists about the effect of part-time work on academic outcomes. This study uses the Bureau of Labor Statistics' National Longitudinal Survey of Youth (NLSY) from 1997 to examine the impact of working part-time on high school dropout rates and grade point average at age 17. After age 16, federal legislation no longer limits the hours or duties of child labor. However, child labor legislation across different states limits the number of hours that teenagers are allowed to work. By exploiting this variation, the study examines whether a teen's hours of work during the age-16 school year have a statistically significant relationship to his or her performance at school. Average hours of work did not have a statistically significant relationship with academic outcomes when using instrumental variables, in contrast to statistically significant results when using ordinary least squares (OLS).
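The instrumental-variables logic behind the OLS/IV contrast above can be sketched on synthetic data. Everything below is invented for illustration (the study's actual instrument is state child-labor law variation): an unobserved confounder drives both the regressor and the outcome, so OLS is biased, while the instrument recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
z = rng.standard_normal(n)   # instrument (think: legal work-hour limits)
u = rng.standard_normal(n)   # unobserved confounder (e.g., motivation)
x = z + u                    # endogenous regressor: hours worked
y = 2.0 * x + u              # outcome; the true effect of x is 2.0

# OLS is biased upward because u drives both x and y.
ols = np.cov(x, y)[0, 1] / np.var(x)
# IV (Wald) estimator: cov(z, y) / cov(z, x) is consistent,
# because z shifts x but is independent of u.
iv = np.cov(z, y)[0, 1] / np.cov(z, x)[0, 1]

print(round(ols, 1))  # about 2.5 (biased)
print(round(iv, 1))   # about 2.0 (consistent)
```

With this data-generating process the OLS probability limit is 2 + cov(u, x)/var(x) = 2.5, so the simulated gap between the two estimators is exactly the endogeneity bias the study is guarding against.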

Yang, Dou-Yan
Working Paper, 2008
Drug Policy Research

Drug policy research is the application of policy analysis in the substance abuse domain with a level of rigor that merits publication in academic journals, on the grounds that the methods and/or results can provide foundational insights upon which subsequent analyses might draw. Policy analysis in turn is an interdisciplinary field that strives to objectively and empirically understand the consequences of different public policy interventions, including both retrospective evaluation of past interventions and prospective projections of contemplated interventions. It is useful to distinguish three types of policy analysis:
1) Analysis of net effects on society as a whole (a "social planner's perspective"),
2) Distributive analysis of effects on each significant group of stakeholders, and
3) Political analysis of what convergence of forces can push through a piece of legislation or other policy change.

Caulkins, Jonathan
Working Paper, 2008
The Gap Gets Bigger

Meara, Ellen R., Seth Richards, and David M. Cutler. 2008. “The Gap Gets Bigger: Changes in Mortality and Life Expectancy by Education, 1981-2000.” Health Affairs, 27(2): 350-360.

Richards-Shubik, Seth
Article, 2008
Human Capital and Economic Development in India

Theoretical models of economic growth have underscored the role of human capital. Empirical analysis of growth for a broad group of countries shows that school attainment has a positive effect on growth. Many studies have found that a region's growth is influenced by its initial level of human capital. Glaeser et al. (1995) find that the level of human capital in 1960 influences the growth of cities between 1960 and 1990. Similarly, Simon et al. (2002) found that cities with a higher initial level of human capital grow faster in the long run. Regional differences in the level of human capital also explain geographic differences in firm formation rates, with regions endowed with a higher level of human capital having higher firm formation rates (Acs, 2004). This paper explores the role of human capital in India's economic development.

Bagde, Surendrakumar
Working Paper, 2008
Initial Conditions and Post-entry Performance: The Case of the Indian Software Industry

This paper studies entrepreneurship and firm survival, using the Indian software service industry as its empirical backdrop. It develops a model with contrasting predictions about the impact of the scale of entry on firm performance, after conditioning on lagged output. The predictions of the model are tested. We find that current size positively influences firm survival, but initial size has a negative effect on firm survival. We also find that founders' human capital, measured by their education, reduces the annual hazard of exit. Firms with access to the market have an advantage, as do firms with experience in the software industry.

Arora, Ashish
Bagde, Surendrakumar
Working Paper, 2008
Modeling Measurement Error When Using Cognitive Test Scores in Social Science Research

In many areas of social science, researchers want to use latent measures of ability as independent variables. Often cognitive test scores are used to measure this latent trait. Many social scientists do not model the measurement error inherent in the test score. This paper introduces the Mixed Effects Structural Equations (MESE) model to account for the measurement error when a cognitive test score is used as an independent variable measuring ability. Unlike the typical linear regression model, which ignores the error and produces biased regression coefficients, the MESE model explicitly models the measurement error. Unlike the typical errors-in-variables (EIV; Anderson, 1984) model, which uses classical test theory (CTT) to model homoskedastic measurement error by ability, the MESE model uses item response theory (IRT) to model heteroskedastic measurement error by ability. The IRT model handles the well-known identifiability issues of the EIV model.
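The bias from ignoring measurement error, which motivates the MESE model, is easy to demonstrate. The following is a generic errors-in-variables simulation with invented parameters, not the MESE model itself: regressing an outcome on a noisy test score attenuates the coefficient toward zero.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
ability = rng.standard_normal(n)                     # latent trait
score = ability + rng.standard_normal(n)             # noisy test score (reliability 0.5)
outcome = 1.0 * ability + 0.1 * rng.standard_normal(n)

# Slope using the (unobservable) true ability vs. the noisy score.
true_slope = np.cov(ability, outcome)[0, 1] / np.var(ability)
naive_slope = np.cov(score, outcome)[0, 1] / np.var(score)

print(round(true_slope, 1))   # about 1.0
print(round(naive_slope, 1))  # about 0.5  (attenuated toward zero)
```

The attenuation factor is the reliability var(ability)/var(score), here 0.5; a model that represents the measurement error, as the MESE model does, is what removes this bias.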

Working Paper, 2008
No Such Thing As A Free Lunch: Investment, Technological Upgrading, and Exports in Indian Pharmaceuticals

The Indian pharmaceutical industry's exports began to exceed its imports in the late 1980s. Since then, exports have grown rapidly, and the leading Indian firms have become significant exporters of generic drugs to the most advanced markets, including the U.S. As the Indian pharmaceutical industry increases its R&D spending and innovative efforts, leading firms clearly hope to export new products and processes to the U.S. and other advanced markets. Because it constitutes a (rare) example of a high-tech exporting industry in a developing country, the Indian pharmaceutical industry provides an interesting context in which to explore the relationship between exports and technological upgrading. We investigate these linkages in this paper. The received literature has suggested that the exposure to advanced country technologies achieved through exports should lead to technological improvements in the exporting firms' products and processes. Researchers have generally tried to measure these improvements by looking for changes in exporting firms' measured total factor productivity (TFP) that could be ascribed to increases in exports. The conceptual association in the literature between technological learning or upgrading through exports and increases in TFP is so strong that the phrase "learning by exporting" has come to mean an increase in TFP following an increase in exports. We find little evidence of a learning effect from exports for the overall industry. Some apparent learning effect is observed for a section of the industry, but only for firms that appear to be technologically backward within the industry.

Chatterjee, Chirantan
Working Paper2008
Optimal Timing of Use vs. Harm Reduction in an SA Model of Drug Epidemics

A debate in drug policy rankles between proponents of use reduction and harm reduction. This paper presents a stylized two-state, one-control dynamic optimization model of this choice based on a social-cost-related definition of harm reduction, and parameterizes it both for cocaine in the U.S. and for Australia's population of injection drug users. Static analysis of a binary choice between pure harm reduction and pure use reduction suggests that whether or not harm reduction is a good strategy can depend on various factors such as the particular drug, the country, the social cost structure, or the stage of the "epidemic". The optimal dynamic control version of the model involves boundary solutions with respect to the control variable with several switches in the optimal policy. The results have interesting interpretations for policy. Even for the U.S. parameterization, harm reduction turns out to have a potential role when drug use is either already pervasive or when use is so rare that there is no danger of explosive increases in initiation, but perhaps not when drug use is near a "tipping point". In contrast, in the parameterization for Australian IDU, where effective harm reduction tactics exist and budgetary costs for harm reduction measures are small, harm reduction appears preferable starting from any initial state. Furthermore, an interesting feature of our simple model is the occurrence of indifference curves, consisting of points where the decision maker is indifferent between two transients that will approach the same steady state in the long run. These transients result in the same social cost for the decision maker, but are characterized by quite different optimal policies.
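For readers unfamiliar with compartmental drug-epidemic models, a minimal two-state sketch conveys the flavor of the dynamics the control analysis builds on (illustrative rates, not the paper's parameterization for U.S. cocaine or Australian IDU): susceptibles initiate use through contact with current users, while users quit at a constant per-capita rate.

```python
import numpy as np

def simulate(s0, a0, tau=0.05, beta=0.4, mu=0.1, dt=0.01, t_end=300.0):
    """Euler-integrate a stylized two-state epidemic:
    S = susceptible non-users, A = current users."""
    s, a = s0, a0
    out = []
    for _ in range(int(t_end / dt)):
        ds = tau - beta * s * a        # inflow minus contagious initiation
        da = beta * s * a - mu * a     # initiation minus quitting
        s += ds * dt
        a += da * dt
        out.append((s, a))
    return np.array(out)

traj = simulate(s0=1.0, a0=0.01)
# Damped oscillation toward the steady state (s*, a*) = (mu/beta, tau/mu),
# i.e. (0.25, 0.5) for these rates; a control policy acts by perturbing
# the initiation or quitting rates along the trajectory.
print(traj[-1])
```

The overshoot-and-settle behavior of such systems is what makes the timing of interventions (and the "tipping point" language in the abstract) matter.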

Caulkins, Jonathan
Working Paper2008
Product Customization and Customer Service Cost: An Empirical Analysis

We conduct a field study at a prominent U.S. health insurance firm to examine how customizing a product affects the firm’s cost of serving customers through its call center. In our setting, the product is a complex health insurance policy. The firm incurs substantial cost in serving customers through its call center and in adjudicating claims using its information systems. The firm either sells standard products or, in some instances, allows customer groups to customize their policy by including or modifying certain aspects of it. We show that the process of customization increases users’ familiarity with their coverage and improves the fit with their medical needs. This, in turn, reduces their incentive to call the firm’s call center for clarifications regarding their product coverage. In particular, we show that users with customized policies call 30% less frequently than users with standard plans, even after controlling for their number of doctor/facility visits. We also show that there is no difference in claim adjudication cost between a standard and a customized policy. Overall, the results suggest that customized products may be operationally cheaper to serve than standard products. The paper thus provides a link between product features and the ex-post cost of serving them.
 

Kumar, Anuj
Working Paper2008
Redemption in the Presence of Widespread Criminal Background Checks

Recent advances in information technology and growing concerns about employer liability have combined to increase the demand for criminal background checks. Since over 14 million arrests are made each year, many individuals have criminal history records. As a result, many individuals who have redeemed themselves by avoiding involvement in crime and are seeking employment will be haunted by the record of a crime committed long ago, a record that may well indicate a low risk of future crime. It is known that the probability of recidivism declines with time "clean" since a last arrest, so that there is some T* such that someone with no arrest for T* years is of no greater risk than any demographically similar counterpart. The problem is that we have very little information on the appropriate value of T*, and how that value varies with the crime type of the earlier event (C1) and the offender’s age at that event (A1). This paper estimates the degree to which a past criminal record loses its relevance in terms of its ability to predict future offending. Data obtained from a state criminal-history repository allow us to estimate the hazard of rearrest, and we look for its intersection with the demographically appropriate age-crime curve, which represents the risk of arrest for the general population. The findings can contribute to the development of relevancy guidelines for users of background-checking services and for policy makers interested in enhancing employment opportunities as they develop regulations regarding the dissemination of such records.
 

Nakamura, Kiminori
Working Paper2008
Rural Electrification, Urban Growth, and Economic Development: Evidence from the Rollout of the U.S. Power Grid (joint with Joshua Lewis (University of Montreal))

Between 1930 and 1970, the Southern United States experienced a striking transformation, moving from an economy characterized by low-wage rural employment to a modern industrialized economy. This paper examines the role that rural electrification played in this transition. We study whether increases in agricultural productivity associated with rural electrification lowered the demand for farm labor, allowing for a shift from the “Old South” to the “New South” (Wright, 1986). The empirical analysis relies on a newly digitized dataset of all major transmission lines in 1923 and 1962, along with detailed information on the timing of power plant openings. To begin, we study the impact of electrification on farm productivity, as measured by land values and average farm size, in a county-level fixed effects framework. Next, we use county-level information on rural and urban populations to examine the link between rural electrification and urban growth. This analysis is supplemented by a study of individual-level migration patterns, based on retrospective data from the census. Our study contributes to our understanding of the Southern economic transition throughout this period, and the role of electricity as an engine of urban growth.

Severnini, Edson
Working Paper2008
Securing Their Future? The Role of Markets for Technology, Organization Capabilities and Opportunity Costs on Business Model and Performance in the Information Security Market

This paper makes use of a unique dataset on the Information Security Market (ISM) to advance our understanding of entrepreneurship. Its purpose is threefold. First, it examines how the presence of a market for technology (MFT) conditions firms' choice of business model and their post-entry performance; it also examines the industry-level effects of MFT by exploring its influence on competition. Second, it explores how entrepreneurs' opportunity costs influence exit as well as the success of their entrepreneurial ventures. Finally, it examines how different types of firm competencies condition the choice of business model and firm survival.

Working Paper2008
The Glass Ceiling for Asian Americans: How Perceptions of Competence and Social Skills Explain Hiring Differentials


Working Paper2008
The Impact of Web-Based Risk Calculators on Health Risk Perceptions and Information Processing

Every day, millions of Americans use the Internet to obtain health information. To satisfy this demand, organizations deliver a variety of content that promotes awareness and education and informs health-related decision making. Given advances in web technology, new statistical models of disease, and the shift towards shared patient decision making, these e-health services are increasingly complex. Through applications such as personal health records and "health risk calculators," Internet users can obtain personalized and interactive feedback about their current health state, model-based predictions about their future health, and tailored education about healthy behavior. While providing the public with more content to inform health-related decisions is an appropriate goal, research in health psychology and behavioral decision making suggests the importance of clearly understanding laypersons' perceptual and behavioral responses when they are presented with statistical results and personalized risk information. Little research has studied how web-based personalized and interactive health applications actually impact the beliefs and behavior of users. In two separate experiments, this research measured the effect of a type 2 diabetes "risk calculator" website on user information processing and subjective risk perceptions about diabetes. In the first experiment, 100 middle-aged and elderly adults were randomized to one of three conditions in order to determine how personalized risk estimates and interactive risk feedback influenced information usage and beliefs about future diabetes onset. Results showed that personalization and interactive features did not lead to increases in information utilization as expected. Instead, this research shows that in some cases personalization actually reduced the amount of information accessed and the extent to which users attended to and carefully considered health risk content.
The experiment did show that personalization was related to modest increases in the accuracy of absolute diabetes risk estimates but did not motivate significant changes in relative or affective risk perceptions. A second study of 34 university staff members was qualitatively suggestive of similar results. Future work is needed to further understand the behavioral implications when complex statistical models are integrated with publicly available health information websites. This may aid the design of health information applications and ensure that providers of these tools are effectively motivating improved awareness and education about health risks.
 

Working Paper2008
Three Essays on Health Outcomes

 

This paper explores two topics related to health outcomes in the adult U.S. population. Over the past fifteen years, the United States has seen a decline in overall health. Two potential reasons behind this deterioration are the rise in overweight and obesity and increases in adverse health outcomes for individuals requiring hospital stays. This research attempts to assess the extent to which these two causes contribute to the decline. The paper focuses on three aspects of health outcomes: (1) who among the adult U.S. population has been affected by the obesity epidemic; (2) whether increases in employment played a role in the rise of obesity among single women; and (3) whether increases in nurse staffing reduce adverse health outcomes for the inpatient hospital population.

 

Working Paper2008
Three Essays on Information Security Policies

Information security breaches pose a significant and increasing threat to national security and economic well-being. In the Symantec Internet Security Threat Report (2003), companies surveyed experienced an average of about 30 attacks per week. Anecdotal evidence suggests that losses from cyber-attacks can run into millions of dollars. The CSI-FBI survey (2005) estimates that the loss per company was more than $500,000 in 2004 and more than $200,000 in 2005. This research analyzes information security policies that attempt to address these issues. In particular, it focuses on the following topics: (1) the vulnerability disclosure policies of several major vulnerability information outlets and their implications for vendors’ patch release behavior; (2) the conformance of software vendors to one of the most important software product security certification standards, Common Criteria certification; and (3) the effectiveness of Common Criteria certification in improving the security quality of software products.

Yang, Yubao
Working Paper2008
Understanding Inertia: Inherent Limitations on Evaluating "Upstream" Prevention Interventions

When different types of policy interventions are available, there is an understandable desire to evaluate all alternatives using common metrics so scarce resources can be allocated in the most efficient manner. However, systems that display significant lags in their response to some interventions can confound such an empirical approach. This paper provides a parsimonious mathematical representation of some of the challenges confronted when trying to evaluate upstream interventions on lagged systems, to help clarify when it is and when it is not practical to expect those interventions to meet the same standard of proof as downstream interventions. Implications for drug policy and delinquency prevention are elaborated.
 

Caulkins, Jonathan
Working Paper2008
Using Receiver Operating Characteristic Analysis to Evaluate Large-Change Forecast Accuracy

This paper applies receiver operating characteristic (ROC) analysis to the M3 Competition micro monthly time series for one-month-ahead forecasts. Using the partial area under the curve (PAUC) criterion as a forecast accuracy measure and paired-comparison testing via bootstrapping, we find that complex methods (AutomatANN, Flores-Pearce2, Forecast ProSmart FCS, and Theta) perform best for forecasting large declines in these time series, which tended as a group to decline over time. A regression model of PAUC on a judgmental index of forecast method complexity provides further confirming evidence. We also find that a combination forecast, consisting of the median value of the top three methods, performs better than the component methods, although not statistically significantly so. The classification of top methods matches that obtained using conventional forecast accuracy measures in the M3 Competition: complex methods forecast these series better than simple ones.
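As a rough illustration of the PAUC criterion, the numpy sketch below computes the trapezoidal area under the empirical ROC curve restricted to low false-positive rates, one common way to define a partial AUC; this is generic code with an arbitrary cutoff, not the authors' implementation.

```python
import numpy as np

def partial_auc(y_true, y_score, max_fpr=0.2):
    """Trapezoidal area under the empirical ROC curve for FPR in [0, max_fpr]."""
    order = np.argsort(-np.asarray(y_score))
    y = np.asarray(y_true)[order]
    tpr = np.concatenate(([0.0], np.cumsum(y) / y.sum()))
    fpr = np.concatenate(([0.0], np.cumsum(1 - y) / (1 - y).sum()))
    # Truncate the curve at max_fpr, interpolating the final segment.
    tpr_end = np.interp(max_fpr, fpr, tpr)
    keep = fpr < max_fpr
    x = np.concatenate((fpr[keep], [max_fpr]))
    t = np.concatenate((tpr[keep], [tpr_end]))
    return float(np.sum(np.diff(x) * (t[:-1] + t[1:]) / 2.0))

# Toy check: a perfect ranking attains the maximal PAUC (= max_fpr).
labels = np.array([1, 1, 1, 0, 0, 0])
scores = np.array([0.9, 0.8, 0.7, 0.3, 0.2, 0.1])
print(partial_auc(labels, scores))  # 0.2
```

Restricting the area to a low-FPR region is what lets PAUC reward methods that flag large declines without raising many false alarms.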

Gorr, Wilpen
Working Paper2008
All Reviews Are Not Created Equal: The Disaggregate Impact of Reviews and Reviewers at Amazon.com

Online product review networks play an important role in Internet commerce by transmitting information that customers can use to evaluate physical products in a digitally mediated marketplace. These networks frequently include an explicit social component allowing consumers to view both how community members have rated individual product reviews and the social status of individual reviewers. However, the prior literature has not analyzed the impact of these social cues on consumer behavior, focusing instead on the impact of aggregate review ratings. This work extends the prior literature by analyzing how these social factors affect consumer responses to disaggregate review information. To do this, we use a new dataset collected from Amazon.com’s customer reviews of books. This dataset allows us to control for the degree to which other community members found a review helpful and for the reputation of the reviewer in the community.

Dhanasobhon, Samita
Smith, Michael
Working Paper2007
An Information Visualization Approach to Classification and Assessment of Diabetes Risk in Primary Care

Chronic disease risk assessment is a common information processing task performed by primary care physicians. However, efficiently and effectively integrating information about many risk factors across many patients is cognitively difficult. Methods for visualizing multidimensional data may augment risk assessment by providing reduced-dimensional displays which classify patient data. This study develops a framework which combines medical evidence, statistical dimensionality reduction techniques, and information visualization to develop visual classifiers for the task of diabetes risk assessment in a population of patients. The framework is evaluated in terms of classification accuracy and medical interpretation for two case studies, prediction of type 2 diabetes onset and prediction of heart attacks in adults with type 2 diabetes.

Working Paper2007
Cosponsorship in the U.S. Senate: A Multilevel Approach to Detecting the Subtle Influence of Social Relational Factors on Legislative Behavior

 

Working Paper2007
Economic and Statistical Models for Affordable Housing Policy Design

Prescriptive planning models for affordable and subsidized housing policy design that have a local focus and which are intended to reflect aspects of current and/or best practices require detailed estimates of various structural parameters. These estimates should ideally reflect observations of actual housing units, households and development projects. We examine here three classes of structural parameters: dollar-valued benefits and costs of affordable/subsidized housing provision, correlates of measures of housing market strength, and locational outcomes of households participating in housing mobility programs. Current results for dollar-valued housing impacts—provision costs, household benefits and subsidy levels—are based on detailed administrative data from a local housing provider and generate promising forecasting models for use in regional-level planning models. Results for the remaining two classes of structural parameters are currently under development.

Johnson, Michael
Working Paper2007
Energy Mix and Political Affiliation: Evidence from U.S. States Since 1960 (joint with Karam Kang (CMU))

The two major political parties in the U.S. appear to take opposite views on energy and environmental policies, such as whether CO2 emissions should be regulated under the Clean Air Act. We intend to investigate the historical relationship between the energy mix and the political environment, as reflected in the party affiliation of state governments in the continental U.S. If we find that a specific political environment affects the role of fossil fuel in the energy mix, measured in terms of the share of fossil-fuel installed capacity and electricity generation, we ask how this impacts the distribution of emissions across the nation. Further, we study how the political environment affects firms' decisions about where to open new plants or expand existing ones.

Severnini, Edson
Working Paper2007
Essays on IT-Mediated Phenomena: IT Knowledge Management, Mobile Telecommunication, and E-Commerce

This dissertation aims to widen our understanding of three IT-mediated phenomena observed after the transformation of social structure accompanying the "paradigm shift": (1) the learning behavior of IT knowledge workers, (2) users’ consumption of wireless communication services, and (3) consumers’ online shopping behavior. In terms of research methodology, all of the essays constituting the dissertation are developed from an economic and econometric analysis. While the essays are only loosely tied together in research methodology or research domain, they all aim to better understand intentional or adaptive human behavior in everyday IT-driven socio-economic environments.

Working Paper2007
Fundamental Patent Reform and the Private Returns to R&D - The Case of Indian Pharmaceuticals

How do private returns to inventive activity change when IPR regimes are substantially strengthened? This paper investigates this question by looking at the impact of patent reforms in India on India-based pharmaceutical companies. In a fundamental policy shift, India agreed to introduce product patents for pharmaceuticals when it signed the WTO TRIPS treaty in 1995. This policy came into effect through an enabling legislation in 2000 and a final implementation in 2005. The dataset is a panel of 315 pharmaceutical firms from 1990 to 2005. Private returns of a firm are measured using a hedonic stock market valuation of the tangible total assets (A) and intangible inventive assets (K). The intangible assets are measured by stocks of R&D expenditure at various literature-specified depreciation rates. We normalize our intangibles with total assets, while using controls like firm sales and aggregate industry dummies in our estimations. The analysis covers stratified industry subsets and watershed periods to capture effects of regime changes. The method of estimation involves pooled OLS regressions with time dummies and fixed effects to account for firm-specific unobserved heterogeneity. We also use non-linear least squares with first differences as a robustness check for our results. The findings reveal a monotonic increase in private returns to inventive activity, with returns peaking around 2005, the year in which product patents were introduced in India. An increase in the depreciation rate of R&D, implying higher obsolescence of R&D activities, results in increasing returns to R&D for various subsets of the industry. This provides early evidence of the market positively valuing more recent R&D activities as firms shift their research capabilities with changes in the patent regime.
 

Chatterjee, Chirantan
Working Paper2007
Implications of Inertia for Assessing Drug Control Policy: Why Upstream Interventions May Not Receive Due Credit

There is ongoing interest in assessing the effectiveness of various drug control strategies, including policy intended to reduce initiation and prevalence. Compartmental models of trajectories of drug use have been developed that demonstrate that drug "systems" display significant inertia; interventions on systems with high inertia can be difficult to evaluate. The implications of inertia are illustrated by combining a new empirically-derived model of national drug initiation with a compartment model of trends in illicit drug use parameterized for Australia.
 

Caulkins, Jonathan
Working Paper2007
Is Objective Risk All That Matters When It Comes to Drugs?

Mokdad et al. (2004) estimate that each year in the United States, 435,000 people die from tobacco use, 85,000 from alcohol, and 17,000 from all illicit substances combined. Yet the American public appears far more concerned about illegal drugs than about tobacco and alcohol use, driving expansions in control efforts far beyond that which is part and parcel of prohibition. The central thesis of this paper is that some of this mismatch in concern may stem from differences in the types of deaths created, with deaths associated with illicit drugs being, on average, "scarier" to the public than are the deaths associated with legal substances in a way that can be grounded in the risk perception and communication literatures. We summarize literature documenting that people care about more than actual death risk. Factors such as voluntariness, control, and familiarity also play a crucial role in determining the perceived risk of an event, and some of those factors seem to be more salient for the illicit drugs than for tobacco and alcohol. Social amplification of risk may also play a role in explaining these perceptions, but may not by itself be the full explanation.
 

Caulkins, Jonathan
Working Paper2007
Mathematical Models for Reconstruction Planning in Urban Areas


Johnson, Michael
Jung, Changmi
Working Paper2007
Might Randomization in Queue Discipline Be Useful When Waiting Cost is a Concave Function of Waiting Time?

This paper raises the question of whether some degree of randomization in queue discipline might be welfare enhancing in certain queues for which the cost of waiting is a concave function of waiting time, so that increased variability in waiting times may be good not bad for aggregate customer welfare. Such concavity may occur if the costs of waiting asymptotically approach some maximum (e.g., for patients seeking organ transplants who will not live beyond a certain threshold time) or if the customer incurs a fixed cost if there is any wait at all (e.g., for knowledge workers seeking a service or piece of information that is required to proceed with their current task, so any delay forces them to incur the "set up charge" associated with switching tasks).
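The concavity argument is essentially Jensen's inequality: for a concave cost function, a more variable waiting-time distribution with the same mean has lower expected cost. A toy numerical illustration (hypothetical saturating cost function and mean wait, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical concave waiting cost that saturates at 1 for long waits,
# e.g. a fixed "task-switching" penalty that dominates once any wait occurs.
def cost(w):
    return 1.0 - np.exp(-w)

mean_wait = 2.0
n = 200_000

# Two waiting-time distributions with the same mean:
steady = np.full(n, mean_wait)               # everyone waits exactly the mean
randomized = rng.exponential(mean_wait, n)   # highly variable waits

# Jensen: E[cost(W)] <= cost(E[W]) for concave cost, so the variable
# discipline yields lower aggregate cost: here cost(2) is about 0.86,
# versus E[1 - exp(-W)] = 1 - 1/(1 + mean_wait), about 0.67, for the
# exponential waits.
print(cost(steady).mean(), cost(randomized).mean())
```

With a convex cost, which is the usual assumption, the inequality flips and variability hurts; the paper's question is precisely when the concave case applies.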
 

Caulkins, Jonathan
Working Paper2007
On Data Quality and Risk in Guideline Based Clinical Decision Support

Guideline based clinical decision support systems provide patient-specific medical guidance to physicians, often at the point-of-care. A large body of research shows that these systems have the potential to reduce practice variation and human error. However, there is also evidence suggesting that these systems may introduce unintended risk into the medical-decision making process. The poor quality of data in medical records and databases poses one such risk. As a result, appropriately assessing the magnitude of the risk posed by data quality is an important, but difficult problem. The nature of this risk depends on several complex and interrelated factors.  This paper provides a novel framework that explicitly models the nature of data, errors, and how guideline based clinical decisions support systems process information and produce guidance. The framework gives the decision-maker the ability to assess how uncertainty about data quality translates into the risk of negative medical consequences and determine which data elements are most critical for minimizing this risk. The results of the framework can inform both efficient data-quality improvement and risk minimization strategies.

Working Paper2007
Three Essays in Industrial Organization of Health Care: Exploring Entry, Exit and Ownership Effects

This paper analyzes three issues of public policy within the U.S. health care market. In doing so, it examines several aspects of provider and payer behavior, and ways in which that can affect the health care sector.

Working Paper2007
Three Essays on Local Labor Markets

This paper explores the intersection of labor economics and urban-regional economics. An important stream of research in urban economics documents large cross-city variation in prices, wages, and other attributes and seeks to provide theoretical explanations for these observations. Standard economic theory predicts that this variation in the vector of prices will affect decision-making by rational individuals, including decisions relevant to local labor markets. The research is aimed at understanding the importance of location-specific attributes, such as local wages, local prices (e.g., real estate prices), and commuting times, in studying labor market outcomes.

Working Paper2007
Three Essays on Prescription Drug Insurance Benefit Design: The Effects of Prescription Drug Cost-Sharing on Health Spending, Compliance, and Health Outcome

In the past 15 years, national spending on prescription drugs has grown dramatically, outpacing the growth of hospital spending and physician spending over the same period. In response, many health insurance plans have reduced the generosity of their prescription drug benefits by imposing increasingly high cost-sharing on patients for their pharmaceutical use. This research contains in-depth evaluations of the impacts of prescription drug insurance benefit design on health spending, consumers' use of medical care, compliance with drug regimens, and health outcomes.

Working Paper2007
Three Essays on the Modeling of Development

This dissertation is intended to advance the methodology and techniques used in the modeling of development. It comprises three loosely tied essays.
The first chapter is entitled "Finite Sample Effects in Group-Based Trajectory Models." It analyzes a very intricate and specific aspect of the broader body of work credited to Daniel Nagin regarding group-based trajectory models. These models, which are an application of finite mixture modeling, are used to model population heterogeneity in the development of various types of behavior, such as physical aggression or anxiety, over age or time.
The second chapter is entitled "Consequences of a Violation of the Conditional Independence Assumption in Group-Based Trajectory Models." This chapter addresses another nuance of the group-based trajectory model, one that is often a point of criticism of the methodology. Specifically, the specification of these group-based trajectory models assumes conditional independence.

The third chapter is entitled "Accounting for Selection to Understand the Effects of Group Daycare on the Development of Physical Aggression." This chapter considers the problem of estimating causal effects of a treatment on an outcome of interest, a problem often addressed by researchers.

Working Paper2007
Three Essays on the Modeling of Development in Criminology and Psychopathology

This dissertation is intended to advance the methodology and techniques used in the modeling of the development of behaviors associated with criminology and psychopathology. It comprises three loosely tied essays.
The first chapter is entitled "Finite Sample Effects in Group-Based Trajectory Models." It analyzes a very intricate and specific aspect of the broader body of work credited to Daniel Nagin regarding group-based trajectory models. These models, which are an application of finite mixture modeling, are used to model population heterogeneity in the development of various types of behavior, such as physical aggression or anxiety, over age or time.
The second chapter is entitled "Consequences of a Violation of the Conditional Independence Assumption in Group-Based Trajectory Models." This chapter addresses another nuance of the group-based trajectory model, one that is often a point of criticism of the methodology.
The third chapter is entitled "Accounting for Selection to Understand the Effects of Group Daycare on the Development of Physical Aggression." This chapter considers the problem of estimating causal effects of a treatment on an outcome of interest, a problem often addressed by researchers.

Working Paper2007
Using Cognitive Test Scores in Social Science Research

A standard problem in social science is to better understand the large wage disparities between black and white workers in U.S. labor markets. Social scientists have conducted hundreds of studies of observed racial wage gaps, seeking to understand the extent to which they are driven by differences in human capital or by disparate treatment by employers. To obtain an unbiased estimate of such effects, it is necessary to include measures of human capital in the regression equations. While years of schooling has traditionally been used as a measure of human capital, social scientists are increasingly turning to cognitive test scores as a more direct measure. Most social science research that uses cognitive test scores as an independent variable models the test score as fixed and without error. However, since test scores contain measurement error, modeling them this way can produce biased results and, in turn, incorrect policy conclusions. Current methods for modeling the test score with error are limited to single-point-in-time analyses with a fixed cognitive assessment administered to all subjects, and to situations in which the measurement error is homogeneous across subjects. In response to these drawbacks, a new model, called the Mixed Effects Structural Equations (MESE) model, is developed. The MESE model is demonstrated using data from the National Adult Literacy Survey to analyze black-white wage gaps among married men, single men, and single women. Three important findings are of note. First, much of the black-white wage gap can be attributed to a black-white disparity in skills, suggesting that more attention ought to be focused on the development of skills. Second, comparisons of the MESE model to a model with no measurement error demonstrate the importance of modeling the measurement error. Third, comparisons of the MESE model to a model using current methodology suggest the MESE model may resolve some of the drawbacks noted in those methods.
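The attenuation bias that motivates this line of work can be illustrated with a small simulation. This is a generic errors-in-variables sketch, not the MESE model itself; the reliability value and the wage equation are invented for illustration:

```python
import random

random.seed(0)

# Simulate true skill, a noisy test score, and wages that depend on true skill.
n = 10_000
reliability = 0.7  # assumed share of test-score variance that is true skill

true_skill = [random.gauss(0, 1) for _ in range(n)]
noise_sd = (1 / reliability - 1) ** 0.5   # gives Var(score) = Var(skill) / reliability
score = [s + random.gauss(0, noise_sd) for s in true_skill]
wage = [2.0 * s + random.gauss(0, 1) for s in true_skill]  # true coefficient is 2.0

def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

b_true = ols_slope(true_skill, wage)   # close to 2.0
b_noisy = ols_slope(score, wage)       # attenuated toward 2.0 * reliability = 1.4
```

Treating the noisy score as if it were measured without error shrinks the estimated coefficient toward zero by the reliability factor, which is exactly the kind of bias the abstract warns can lead to incorrect policy conclusions.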

Working Paper, 2007
A Note on the Trade-off between Ecosystem Preservation and Air Quality: Evidence from Hydroelectric Licensing Rules

Do environmental regulations aimed at preserving natural ecosystems really protect the environment? The answer seems to be: not really. I present suggestive evidence that, while hydroelectric licensing rules conserve wilderness and wildlife by restricting the development of hydro projects, they lead to more greenhouse gas emissions. Basically, land conservation regulations give rise to a replacement of hydropower, a renewable, non-emitting source of energy, with conventional fossil-fuel power, which is highly polluting. I find that, on average, each megawatt of hydropower that is not developed because of these regulations induces the same amount of carbon dioxide that a U.S. coal-fired plant would emit in producing a megawatt of electricity. Environmental regulations focusing only on the preservation of ecosystems appear to encourage electric utilities to substitute dirtier fuels for electricity generation.

Severnini, Edson
Working Paper, 2006
Cost-Benefit Analyses of Investments to Control Illicit Substance Abuse and Addiction

This paper gives an overview of what is known concerning illicit drug control interventions’ “return on investment” performance from a social planner’s perspective. It is organized by broad type of intervention (supply control, prevention, treatment, harm reduction, and integration across intervention types). The discussion is primarily US-centric, with somewhat greater reliance on international literature vis-à-vis harm reduction.

Caulkins, Jonathan
Working Paper, 2006
Decision Models for Affordable Housing and Sustainable Community Development

Researchers in urban housing and community development face significant challenges in evaluating the success of efforts to improve urban neighborhoods, and identifying underlying theories that might predict the success of future initiatives. Practitioners in this field confront political considerations, restrictive administrative guidelines and limited funding. In the face of these challenges, prescriptive decision models have the potential to improve policy responses to challenges such as affordable housing, race and class segregation, ineffective and/or inequitable economic development, and urban sprawl. This paper first reviews previous research in these areas across multiple disciplines and identifies important limitations and modeling opportunities. It then describes recent work on tools and methods to assist policymakers in designing effective and sustainable housing and community development strategies. These strategies are based on explicit values of maximization of social welfare and social equity and use best-available evidence regarding impacts of housing and community development policies on program participants and non-participants. The discussion is animated by a case study of a hypothetical affordable housing policy initiative for a diverse metropolitan area that we analyze from three perspectives. The first is a long-term and national perspective in which stylized policy models provide insight into large-scale implementation of this program. The second perspective is the medium-term and regional, in which detailed planning models and decision support systems provide specific guidance to housing providers. The third perspective is the short-term and local, in which decision support systems assist individuals’ choice of specific alternatives as defined by housing initiatives.
These complementary modeling perspectives are shown to provide technical and policy insights that differ significantly from, and improve on, those associated with conventional methods.


Johnson, Michael
Working Paper, 2006
Empirical Calibration of Time Series Monitoring Methods Using Receiver Operating Characteristic Curves

Time series monitoring methods, such as the Brown and Trigg methods, have the purpose of detecting pattern breaks in time series data reliably and in a timely fashion. Traditionally, researchers have used the average run length statistic (ARL) on results from generated signal occurrences in simulated time series data to calibrate and evaluate these methods, with a focus on timeliness of signal detection. This paper investigates the receiver operating characteristic (ROC) framework, well-known in the diagnostic decision making literature, as an alternative to ARL analysis for time series monitoring methods. ROC analysis traditionally uses real data to address the inherent tradeoff in signal detection between the true and false positive rates when varying control limits. We illustrate ROC analysis using time series data on crime at the patrol district level in two cities and use the concept of Pareto frontier ROC curves and reverse functions for methods such as Brown's and Trigg's that have parameters affecting signal-detection performance. We compare the Brown and Trigg methods to three benchmark methods, including one commonly used in practice. The Brown and Trigg methods collapse to the same simple method on the Pareto frontier and dominate the benchmark methods under most conditions. The worst method is the one commonly used in practice.
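The threshold-varying logic behind an ROC analysis of a monitoring method can be sketched as follows. This is a generic illustration using simple exponential smoothing on simulated series, not the paper's crime data or the Brown and Trigg methods; all parameters are invented:

```python
import random

random.seed(1)

def smooth_residuals(series, alpha=0.3):
    """One-step-ahead forecast errors from simple exponential smoothing."""
    level, resid = series[0], []
    for x in series[1:]:
        resid.append(x - level)
        level += alpha * (x - level)
    return resid

def make_series(shift):
    """A short series; if shift is True, a level break (the 'signal') occurs midway."""
    base = [random.gauss(10, 1) for _ in range(30)]
    return (base[:15] + [x + 3 for x in base[15:]]) if shift else base

cases = [(make_series(s), s) for s in [True, False] * 200]

def roc_point(threshold):
    """True- and false-positive rates when alarming on any residual beyond the limit."""
    pos = sum(1 for _, s in cases if s)
    neg = len(cases) - pos
    tp = fp = 0
    for series, has_shift in cases:
        alarm = any(abs(r) > threshold for r in smooth_residuals(series))
        if alarm and has_shift:
            tp += 1
        elif alarm:
            fp += 1
    return tp / pos, fp / neg

# Varying the control limit traces out the ROC curve.
roc = [roc_point(t) for t in (1.5, 2.0, 2.5, 3.0, 3.5)]
```

Loosening the control limit raises both the true- and false-positive rates, which is the tradeoff the ROC framework makes explicit, in contrast to ARL analysis, which focuses on timeliness.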

Cohen, Jacqueline
Garman, Samuel
Gorr, Wilpen
Working Paper, 2006
Engineering-Based Methods for Affordable Housing and Sustainable Community Development

The purpose of this paper is to highlight new and creative research in a variety of disciplines, especially the decision sciences, that helps determine when, where, what type of, and by what means affordable housing and sustainable communities might be built, redeveloped, and maintained. As a prelude, it is useful to draw a link between housing planning and supply chain management, the theme of the Frontiers of Engineering session in which this paper appears.

Johnson, Michael
Working Paper, 2006
Exploring Differences in Estimates of Visits to Emergency Rooms for Injuries from Assaults Using the NCVS and NHAMCS

Researchers seeking to provide a better understanding of crime statistics tend to compare survey-based statistics such as the NCVS with data from police administrative series like the UCR. Because these two types of data collection systems are so different, simple direct comparisons are of little value regarding limitations inherent to a particular data collection system. This chapter explores the NCVS data using a different perspective that compares data from the national crime survey of the population with those from a national survey of establishments, the National Hospital Ambulatory Medical Care Survey (NHAMCS). This comparison provides an understanding of how the design, instrumentation and procedures of the NCVS may influence estimates of interpersonal violence, particularly that component of violence resulting in injuries treated in hospital emergency rooms. The estimates of emergency room visits for injuries due to violence obtained from the NCVS are considerably smaller than those from the NHAMCS. The analyses include a series of adjustments to these estimates that explore the role of features specific to each survey in the observed differences. The household sampling frame employed in the NCVS receives special attention as a potential source of the observed differences. Investigating this source of divergence is particularly important, since many of our major social indicators on the economy and participation in government programs depend upon household surveys. If some population groups are under-represented in the household sampling frame used in Census surveys, and this under-coverage results in underestimates of violence, this finding could have implications for the use of the household frame to estimate the magnitude of other problems that disproportionately affect marginal populations, such as unemployment, poverty, drug abuse and poor health status.
The first section of this paper describes the two surveys, but principally the NHAMCS, since the NCVS is described extensively in Chapter 2. The second section presents the unadjusted estimates of the rate of emergency room visits due to violent crime from the two surveys. The third section outlines a series of potential explanations for the observed rate differences and the last section includes a series of adjustments to the rates designed to test the plausibility of the various explanations.

Cohen, Jacqueline
Working Paper, 2006
Heroin and Methamphetamine Seizures in Victoria, Australia: Purity Changes Associated with the Heroin "Drought"

The Australian heroin "drought" was a singular event deserving of the considerable scholarly attention it has engendered. The best way to understand market disruption is to examine both supply and demand side indicators, yet data on the former have been relatively neglected. Here we explore a rich data set on heroin and methamphetamine purity from 1998-2002 in Victoria that support monthly and even fortnightly time series. These series show that the drought was characterized by abrupt and substantial declines in heroin purity (from ~40% to as low as 10-15%), but those steep declines followed an extended period of substantial erosion in purity (from 70-75% in early 1999 to ~40% by the end of 2000). Purity rebounded from its post-drought lows but far from completely, stabilizing at ~20% for 2002. The heroin purity declines do not appear to stem from “cutting” at lower market levels. The declines did increase the purity variability per pure unit of heroin. There was no comparable evidence of contemporaneous effects in the methamphetamine purity series.

Caulkins, Jonathan
Godkin, Caroline
Working Paper, 2006
Incentive Stackelberg Strategies for a Dynamic Game on Terrorism

This paper presents a dynamic game model of international terrorism. The time horizon is finite, about the size of one presidency, or infinite. Quantitative and qualitative analysis of incentive Stackelberg strategies for both decision-makers of the game ("The West" and "International Terror Organization") allows statements about the possibilities and limitations of terror control interventions. Recurrent behavior is excluded with monotonic variation in the frequency of terror attacks whose direction depends on when the terror organization launches its terror war. Even optimal pacing of terror control operations does not greatly alter the equilibrium of the infinite horizon game, but outcomes from the West’s perspective can be greatly improved if the game is only "played" for brief periods of time and if certain parameters could be influenced, notably those pertaining to the terror organization’s ability to recruit replacements.

Caulkins, Jonathan
Working Paper, 2006
Leveraging Social Networks To Motivate Individuals to Reduce their Ecological Footprints

What role can social networking websites play in supporting large-scale group action and change? This paper proposes to explore their use in supporting individual reductions in personal energy consumption. We summarize some existing uses of social networking on the web and propose an approach that integrates feedback about ecological footprint data into existing social networking sites and Internet portal sites. Integrating such feedback into popular, commonly used sites allows frequent feedback about performance, while enabling the exploration of motivational schemes that leverage group membership. We propose to compare different motivational schemes in three ways: reduction in CO2 emissions; lifestyle changes; and ongoing use by users who join the site (retention).

Johnson, Michael
Working Paper, 2006
Optimizing Counter-Terror Operations: Should One Fight Fire with "Fire" or "Water"?

This paper deals dynamically with the question of how recruitment to terror organizations is influenced by counter-terror operations. This is done within an optimal control model, where the key state is the (relative) number of terrorists and the key controls are two types of counter-terror tactics: one ("water") that does not provoke recruitment of new terrorists and one ("fire") that does. The model is nonlinear and does not admit analytical solutions, but an efficient numerical implementation of Pontryagin’s Minimum Principle allows for solution with base-case parameters and considerable sensitivity analysis. Generally this model yields two different steady states: one where the terror organization is nearly eradicated and one with a high number of terrorists. Whereas water strategies are used at almost any time, it can be optimal not to use fire strategies if the number of terrorists is below a certain threshold.

Caulkins, Jonathan
Working Paper, 2006
Practitioner Perspectives on Affordable Housing Policy Design: What Role for Prescriptive Models?

Urban affordable housing planning and development is a challenging enterprise. Practitioners, who range from small neighborhood community development corporations to county-level agencies to nonprofit developers operating at the regional level, routinely solve problems involving multiple stakeholders, competing objectives, funding sources, production processes, strategies, and outcome measures. Academic research addressing the design and evaluation of policies that address community concerns and generate significant social impacts is limited. Using surveys and in-depth interviews with affordable housing providers in the Pittsburgh metropolitan area, this paper describes current practices and organizational values and tests hypotheses about the relationships between organizational characteristics, neighborhood characteristics, and residential real estate development choices and methods. These findings are used to formulate quantitative planning models that address two problems faced by providers: the choice of parcels to develop to maximize the social benefit associated with low-income housing search, and the choice of parcels to develop to maximize neighborhood-level benefits.

Fisher, Meredith
Johnson, Michael
Working Paper, 2006
Public Participation and Decision Support Systems: Theory, Requirements, and Applications

Public-sector policy design is inherently complex: there are multiple stakeholders, multiple objectives, and ambiguous or unknown evidence regarding effective policy interventions. One response to this problem is the design of decision support systems (DSS) to help ordinary citizens as well as policymakers make choices that enrich their daily lives and participate in policy and planning processes. DSS typically combine information technology tools for data storage, analysis, and visualization with decision models that help users formulate and solve problems that might otherwise appear impossible to formulate clearly or are computationally intractable. This paper proposes a framework for developing DSS for public sector policy making that reflects a number of key principles. First, DSS should be values-based: reflective of ethical and moral considerations that motivate users to address problems of public interest. Second, they should be evidence-based: containing data and functional relationships that represent best knowledge and practices. Third, they should be model-based: containing representations of real-world systems that generate actionable recommendations based on multiple choice alternatives. Last, they should facilitate creativity and negotiation: enabling multiple stakeholders to collaborate, explore "what-if" questions easily, and identify "best-compromise" solutions quickly. Evidence to support this theory of public DSS comes from current projects on increasing citizen participation in initiatives to reduce energy use, policy design for senior services provision, and choosing new neighborhoods for use of tenant-based housing subsidies.

Johnson, Michael
Working Paper, 2006
The Davao City Health System: An Approach to Optimally Locating Community Health Facilities

Two hierarchical location-allocation models are used to locate two types of community-based health facilities in Davao City, Philippines. Oftentimes, the siting of health facilities is driven by factors such as politics and resource availability. We present different optimization approaches to locating a mix of these health facilities across the rural and urban areas of the city. Computational results are evaluated based on three performance metrics: operating costs, average weighted travel distance, and population coverage.

Johnson, Michael
Silva, Esmeralda
Working Paper, 2006
The Indian Software Industry and Its Prospects

India’s emergence as a major exporter of software services in less than a decade and a half has excited debate about the causes of its success and ignited hopes for similar success in other industries. The subsequent growth of exports of other business services appears to validate the belief of some observers that India’s software success would have broader benefits for the Indian economy. Despite this, there is a perennial undercurrent of concern about the prospects of the Indian software industry. The causes for concern are not difficult to find. Wages for software professionals have consistently risen year over year and employee attrition remains a persistent problem for companies. Indian exports continue to be mostly services with a modest technology content and there is little evidence of successful product development. Add to these the ever present possibility of China (or Eastern Europe or the Philippines) emerging as potent rivals, and there is much to be concerned about. This paper briefly describes the industry's growth and evolution, identifies the major factors that contributed to its success, and notes some plausible factors that turned out not to be important. It then turns to the industry's prospects, summarizing the available evidence on the extent to which India and Indian firms are participating in software innovation. This leads to an assessment of whether the industry has provided, and can continue to provide, higher-value-added products and services.

Arora, Ashish
Working Paper, 2006
Using Integer Programming to Optimize Investments in Security Countermeasures: A Practical Tool for Fixed Budgets

Software engineers and businesses must make the difficult decision of how much of their budget to spend on software security mitigation for the applications and networks on which they depend. This article introduces a novel method that uses integer programming (IP) to choose the combination of security countermeasures that maximizes system security under fixed resources. The article describes the steps involved in our approach and discusses recent results from a case study with a client.
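The budgeted selection problem such an IP addresses can be illustrated on a toy instance. The countermeasure names, costs, and risk-reduction scores below are invented, and exhaustive search over the 0/1 choices stands in for a real IP solver:

```python
from itertools import combinations

# Hypothetical countermeasures as (name, cost, risk-reduction score); all values invented.
measures = [
    ("input validation", 3, 7),
    ("static analysis", 5, 9),
    ("penetration test", 8, 12),
    ("web app firewall", 4, 6),
    ("developer training", 2, 4),
]
budget = 12

# Maximize total risk-reduction score subject to total cost <= budget.
best_score, best_set = 0, ()
for r in range(len(measures) + 1):
    for combo in combinations(measures, r):
        cost = sum(c for _, c, _ in combo)
        score = sum(s for _, _, s in combo)
        if cost <= budget and score > best_score:
            best_score, best_set = score, combo
```

With realistic numbers of countermeasures, enumerating all subsets becomes infeasible, which is why a 0/1 integer program and a dedicated solver are the natural formulation.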

Caulkins, Jonathan
Working Paper, 2006
What Do We Know About Competition and Quality in Health Care Markets?

The goal of this paper is to identify key issues concerning the nature of competition in health care markets and its impacts on quality and social welfare and to identify pertinent findings from the theoretical and empirical literature on this topic. The theoretical literature in economics on competition and quality, the theoretical literature in health economics on this topic, and the empirical findings on competition and quality in health care markets are surveyed and their findings assessed.

Gaynor, Martin
Working Paper, 2006
A Genetic Algorithm for the Home-Delivered Meals Location-Routing Problem

Home-delivered meals (HDM) provision is a volunteer-staffed activity for which little strategic planning is currently performed. This paper presents and evaluates a Genetic Algorithm to solve the HDM location routing problem (LRP). This planning model addresses facility location, allocation of demand to facilities, and design of delivery routes, while balancing efficiency and effectiveness considerations. We provide computational results on benchmark LRP instances.
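The flavor of a genetic algorithm for this kind of problem can be sketched on a toy facility-location instance. This omits the routing component and uses invented coordinates and costs; it is not the paper's algorithm or its benchmark data:

```python
import random

random.seed(2)

# Toy instance: candidate facility sites and client locations on a plane (all invented).
sites = [(0, 0), (10, 0), (0, 10), (10, 10), (5, 5)]
clients = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]
open_cost = 20.0  # assumed fixed cost per opened facility

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def fitness(genome):
    """Total travel to nearest open facility plus opening costs; lower is better."""
    opened = [s for s, g in zip(sites, genome) if g]
    if not opened:
        return float("inf")
    travel = sum(min(dist(c, s) for s in opened) for c in clients)
    return travel + open_cost * len(opened)

def evolve(pop_size=20, generations=60):
    """Elitist GA over 0/1 genomes: keep the better half, breed the rest."""
    pop = [[random.randint(0, 1) for _ in sites] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(sites))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # bit-flip mutation
                i = random.randrange(len(sites))
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

A full location-routing GA would additionally encode delivery routes per facility and score them against efficiency and effectiveness objectives, but the select-crossover-mutate loop has the same shape.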

Johnson, Michael
Roehrig, Stephen
Working Paper, 2005
Better Safe than Sorry: Precautionary Reasoning and Implied Dominance in Risky Decisions

In four studies, student and nonstudent participants evaluated the possible outcomes of binary decisions involving health, safety, and environmental risks (e.g., whether to issue a dam-failure evacuation order). Many participants indicated that false positives (e.g., evacuation, but no dam failure) were better than true negatives (e.g., no evacuation and no dam failure), thereby implying that an incorrect decision was better than a correct one, and that the more protective action dominated the less protective action. A common rationale for this response pattern was the precautionary maxim "Better safe than sorry". Participants apparently evaluated outcomes partly on the basis of the decisions that might lead to them, in conflict with consequentialist decision models. Consistent with this explanation, the prevalence of implied dominance decreased substantially when the emphasis on decisions was reduced. These results demonstrate that initial preferences for decision alternatives can seriously bias the evaluation of consequences in risky high-stakes decisions.

DeKay, Michael
Article, 2005
Bifurcating DNS Thresholds in a Model of Organizational Bridge Building

A simple optimal control model is introduced in which "bridge building" positions are rewarded. The optimal solutions can be classified with respect to two external parameters: (1) the cost of the control staying at such an exposed position and (2) the discount rate. A complete analytical description of the bifurcation lines in parameter space is derived, separating regions with different optimal behavior: resisting the influence of inner and outer forces, always falling off from the boundaries, or deciding based on one’s initial state. This latter case gives rise to the emergence of so-called Dechert-Nishimura-Skiba (DNS) points describing optimal solution strategies. Furthermore, the bifurcation from a single DNS point into two DNS points is analyzed in parameter space. All these strategies have a well-founded interpretation within the limits of the model.

Caulkins, Jonathan
Working Paper, 2005
Brand Image and Brand Dilution in the Fashion Industry

This paper develops a dynamic optimal control model of a fashion designer's challenge of maintaining brand image in the face of short-term profit opportunities through expanded sales that risk brand dilution in the longer run. The key state variable is the brand's reputation, and the key decision is sales volume. Depending on the brand's capacity to command higher prices, one of two regimes is observed. If the price mark-ups relative to production costs are modest, then the optimal solution may simply be to exploit whatever value can be derived from the brand in the short run and retire the brand when that capacity is fully diluted. However, if the price mark-ups are more substantial, then an existing brand should be preserved. It may even be worth incurring short-term losses while increasing the brand's reputation, even if starting a new brand name from scratch is not optimal.

Caulkins, Jonathan
Working Paper, 2005
Can a Spatial Decision Support System Improve Low-Income Service Delivery? Analysis of Tools and Requirements for a Computer-Assisted Mobility Counseling System

Assisted housing counseling can enable clients to choose affordable housing in opportunity-rich communities. However, the quality of such services may vary widely. This paper presents a framework for developing a spatial decision support system (SDSS) for housing mobility counseling and presents evidence regarding the capacity of assisted housing clients to make productive use of such an SDSS. We conclude that there are significant opportunities for, and social returns to, research on information technology-enabled housing and neighborhood choice.

Johnson, Michael
Working Paper, 2005
Can Housing Mobility Programs Make a Long-Term Impact on the Lives of Poor Families and the Health of Middle-Class Communities: A Policy Simulation

Housing mobility programs enable families living in high-poverty neighborhoods to relocate to lower-poverty neighborhoods using tenant-based subsidies. Recent research indicates that these programs improve participants' outcomes on a number of economic and social dimensions. This paper applies policy simulation to a stylized representation of a housing mobility program to give a sense of scale and proportion for what a "full scale" mobility program might entail. Results indicate that this system model reaches steady state fairly quickly, and that rates of concentrated poverty decrease more quickly than those for system-wide poverty, consistent with the notion of a housing mobility program as primarily a tool for poverty deconcentration. Destination communities absorb a substantial number of mobility in-movers without suffering substantial adverse demographic impacts, indicating that the "carrying capacity" of these communities may be sufficient to support large-scale mobility initiatives. Middle-class flight per mobility family is moderately high and almost independent of housing mobility program intensity; selected sprawl-related social costs are relatively small. Sensitivity analyses show that the model behaves in predictable ways in response to changes to structural parameters. A "worst-case" scenario of parameter values still generates modest poverty reductions with moderate levels of poverty in destination communities but very high rates of middle-class "flight".

Caulkins, Jonathan
Johnson, Michael
Working Paper, 2005
Digital Demand: Demand for Digital Cameras on eBay

The paper estimates the demand for new digital cameras sold in eBay auctions. Data from eBay seem to offer significant advantages over traditional transactions data for estimating demand for differentiated products. However, there are a number of concerns, including censoring bias and the interpretation of bidding behavior. This paper discusses these problems as well as possible solutions. The paper presents results from three different methods for estimating demand for differentiated products on eBay. The results suggest that the demand for digital cameras is highly elastic and that there is not a lot of substitution, particularly across brands.

Vogt, William
Working Paper, 2005
Disclosure Risk vs. Data Utility through the R-U Confidentiality Map in Multivariate Settings

Information organizations, such as statistical agencies, must ensure that data access does not compromise the confidentiality afforded data providers, whether individuals or establishments. Recognizing that deidentification of data is generally inadequate to protect confidentiality against attack by a data snooper, information organizations (IOs)—such as statistical agencies, data archives, and trade associations—can implement a variety of disclosure limitation (DL) techniques—such as topcoding, noise addition and data swapping—in developing data products. Desirably, the resulting restricted data have both high data utility U to data users and low disclosure risk R from data snoopers. IOs lack a framework for examining tradeoffs between R and U under a specific DL procedure. They also lack systematic ways of comparing the performance of distinct DL procedures. To provide this framework and facilitate comparisons, the R-U confidentiality map is introduced to trace the joint impact on R and U to changes in the parameters of a DL procedure. Implementation of an R-U confidentiality map is illustrated in the case of multivariate noise addition. Analysis is provided for two important multivariate estimation problems: a data user seeks to estimate linear combinations of means and to estimate regression coefficients.
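One way to picture an R-U confidentiality map for noise addition is to trace risk and utility as the noise level varies. This is a univariate toy sketch with invented risk and utility measures, not the paper's multivariate analysis:

```python
import random

random.seed(3)

# Original confidential values (illustrative univariate case).
data = [random.gauss(50, 10) for _ in range(200)]
true_mean = sum(data) / len(data)

def ru_point(noise_sd):
    """One point on an R-U map for additive-noise masking.

    Risk R: fraction of released records a snooper re-identifies by matching
    each released value to its nearest original value.
    Utility U: negative absolute error in the released sample mean.
    """
    released = [x + random.gauss(0, noise_sd) for x in data]
    hits = 0
    for i, r in enumerate(released):
        nearest = min(range(len(data)), key=lambda j: abs(data[j] - r))
        hits += (nearest == i)
    risk = hits / len(data)
    utility = -abs(sum(released) / len(released) - true_mean)
    return risk, utility

# Sweeping the noise parameter traces the R-U curve: more noise,
# lower disclosure risk, but also lower data utility.
curve = [ru_point(s) for s in (0.1, 1.0, 5.0, 20.0)]
```

Plotting `curve` with risk on one axis and utility on the other gives the tradeoff frontier the paper formalizes; comparing such curves across masking procedures is what lets an information organization choose among them.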

Duncan, George
Working Paper, 2005
Distortion of Outcome and Probability Information in Risky Decisions

Substantial evidence indicates that information is distorted during decision making. However, no studies have assessed the distortion of outcome and probability information in risky decisions or the effects of ambiguity on information distortion. This paper reports two studies involving six binary decisions (e.g., banning blood donations from people who have visited England, because of "mad cow disease"). In Study 1, participants distorted their evaluations of outcome and probability information in the direction of their preferred decision alternative and used these biased evaluations to update their preferences. Participants also evaluated the utilities of possible outcomes more positively when the outcomes could follow only from the preferred alternative and more negatively when they could follow only from the competing alternative. In Study 2, we manipulated ambiguity by describing outcomes and probabilities using either point estimates or ranges of values. Results replicated those of Study 1, with no effects of ambiguity on information distortion.

DeKay, Michael
Working Paper, 2005
Does the Profit Motive Make Jack Nimble? Ownership Form and the Evolution of the U.S. Hospital Industry

This paper examines the evolving structure of the U.S. hospital industry since 1970, focusing on how ownership form influences entry and exit behavior. We develop theoretical predictions based on the model of Lakdawalla and Philipson, in which for-profit and not-for-profit hospitals differ regarding their objectives and costs of capital. The model predicts for-profits would be quicker to enter and exit than not-for-profits in response to changing market conditions. We test this hypothesis using data for all U.S. hospitals from 1984 through 2000. An examination of annual and regional entry and exit rates shows that for-profit hospitals consistently have higher entry and exit rates than not-for-profits. Econometric modeling of entry and exit rates yields similar patterns. Estimates of an ordered probit model of entry indicate that entry is more responsive to demand changes for for-profit than not-for-profit hospitals. Estimates of a discrete hazard model for exit similarly indicate that negative demand shifts increase the probability of exit more for for-profits than not-for-profits. Finally, membership in a hospital chain significantly decreases the probability of exit for for-profits, but not not-for-profits.

Gaynor, Martin
Klepper, Steven
Vogt, William
Working Paper, 2005
Evaluating a Deliberative Method for Ranking Environmental Risks in China

Previous research at Carnegie Mellon University introduced a systematic method for public participation in risk ranking. The method has been tested successfully with participants from the United States using school risks and environmental risks. To explore the viability of the method in another cultural setting, it was tested with Chinese participants, because previous research has shown that Americans and Chinese differ substantially along many dimensions of cognition and social organization relevant to decision making. Using 10 environmental hazards based on current environmental regulatory programs in a Chinese city as the risk domain, 5 groups of 8-9 participants ranked the hazards using both holistic and multiattribute approaches. Resulting judgment patterns for the Chinese participants were consistent with those observed in previous studies with Americans, providing additional evidence for the robustness of the method. Risk rankings from the holistic and multiattribute approaches were reasonably consistent, both for individuals and for groups. Explicit and implicit measures indicated that participants were satisfied with the procedures and resulting rankings. Results for Chinese participants were compared with previous results for Americans to look for cultural effects involving risk perception, deference to technical expertise and quantitative analysis, deference to senior group members, adherence to groups’ rankings because of collectivism, and skepticism toward public participation in policy making. Although there were some hints of cultural differences, no cultural effect had a substantial effect on measures of the method’s validity or replicability. Because the Carnegie Mellon method offers a scientifically sound and measured approach to legitimate public involvement, it may be attractive to Chinese leaders as they respond to growing demand for public participation in risk-management policy.

DeKay, Michael
Working Paper2005
Explaining Fashion Cycles: Imitators Chasing Innovators in Product Space

This paper considers the problem of a fashion trend-setter confronting an imitator who can produce the same product at lower cost. A one-dimensional product space is considered, which is an abstraction of the key attribute of some consumer good. Three broad strategies can be optimal for the fashion leader: (1) Never innovate; milk profits from the initially advantageous position but ultimately concede the market without a fight. (2) Innovate once but only once, which just temporarily defers conceding the market. (3) Cycle infinitely around product space, never letting the imitator catch up and capture the market. Sometimes the cycles start immediately; sometimes the innovator should wait for a time before beginning the cycles. The optimal solution exhibits strong state-dependency, with so-called Skiba curves separating regions of state space in which these strategies are optimal. There are even instances of intersecting Skiba curves. In most cases, analytical expressions can be stated that characterize these Skiba curves.

Caulkins, Jonathan
Working Paper2005
Flexible Affordable Housing Policy Design using Facility Location Models

Affordable and subsidized housing providers must design and implement housing strategies: where, when and with what types of housing to best meet the needs of low- and moderate-income households for affordable/low-cost permanent shelter. Recent research has proposed a multiobjective integer programming model for this purpose that jointly optimizes measures of net social benefit and equity and addresses limited variations in housing characteristics. This paper extends the affordable housing planning model to better reflect current research and practice in affordable housing and better meet the needs of affordable housing providers.

Johnson, Michael
Working Paper2005
High and Low Frequency Oscillations in Drug Epidemics

This paper extends the two-dimensional model of drug use introduced in Behrens et al. [1999, 2000, 2002] by introducing two additional states that model in more detail newly initiated ("light") users' response to the drug experience. Those who dislike the drug quickly "quit" and briefly suppress initiation by others. Those who like the drug progress to ongoing ("moderate") use, from which they may or may not escalate to "heavy" or dependent use. Initiation is spread contagiously by light and moderate users, but is moderated by the drug's reputation, which is a function of the number of unhappy users (recent quitters + heavy users). The model reproduces recent prevalence data from the U.S. cocaine epidemic reasonably well, with one pronounced peak followed by decay toward a steady state. However, minor variation in parameter values yields both long-run periodicity with a period akin to the gap between the first U.S. cocaine epidemic (peak ~1910) and the current one (peak ~1980), as well as short-run periodicity akin to that observed in data on youthful use for a variety of substances. The combination of short- and long-run periodicity is reminiscent of the elliptic bursters described by Rubin and Terman [2002]. The existence of such complex behavior including cycles, quasi-periodic solutions, and chaos is proven by means of bifurcation analysis.

Caulkins, Jonathan
Working Paper2005
How studies of the cost-of-illness of substance abuse can be made more useful for policy analysis

An elementary step in drug policy analysis is comparing the cost of an intervention to its benefit in the form of the social cost averted because of reduced drug use and associated consequences. One would think that cost-of-illness (COI) studies would provide a solid foundation for quantifying the benefits of reduced drug use, but at present they do not. This paper suggests ways COI studies could be adapted to better serve policy-analytic purposes.

Caulkins, Jonathan
Working Paper2005
Illicit Drug Markets and Economic Irregularities

Markets for illicit drugs present an interesting case study for economics, combining non-standard characteristics such as addiction and product illegality. One response has been to argue the generality of economic principles by suggesting that they apply even in the extreme case of markets for addictive substances, e.g., by showing that demand for illicit goods is responsive to price [1] and even by modeling addiction as rational [2]. This paper sketches examples of an alternative reaction, focusing on idiosyncrasies of drug markets that might plausibly create counter-intuitive effects, including supply curves that slope downward because of enforcement swamping and/or a good serving as the only available store of wealth for its producer, demand reduction programs that increase demand, and consumption by “jugglers” possibly increasing rather than decreasing as prices rise. This analysis yields non-obvious policy recommendations; for example, source country control programs should concentrate on growing regions with a healthy banking sector.

Caulkins, Jonathan
Working Paper2005
Impact of Software Vulnerability Announcements on the Market Value of Software Vendors - An Empirical Investigation

Researchers in the area of information security have mainly been concerned with tools, techniques and policies that firms can use to protect themselves against security breaches. However, information security is as much about secure software as it is about security software. Software is not secure when it has defects or flaws that can be exploited by hackers to mount attacks such as unauthorized intrusion or denial of service. Any public announcement about a software defect is termed a 'vulnerability disclosure'. Although research in software economics has studied firms' incentives to improve overall quality, there have been no studies showing that software vendors have an incentive to invest in building more secure software. This paper uses the event study methodology to examine the role that financial markets play in determining software vendors' incentives to build more secure software. Data are collected from leading national newspapers and industry sources like CERT by searching for reports on published software vulnerabilities. It is shown that vulnerability disclosures lead to a negative and significant change in market value for a software vendor. On average, a vendor loses around 0.6% in stock price when a vulnerability is reported, equivalent to a loss in market capitalization of $0.86 billion per vulnerability announcement. To provide further insight, the information content of the disclosure announcement is used to classify vulnerabilities into various types.
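
The event-study logic described above can be sketched with a market model: estimate a vendor's normal relationship to the market over a window before the announcement, then measure the announcement-day deviation. This is a minimal sketch; the function name, the plain OLS estimator, and the 120-day window are illustrative assumptions, not the paper's exact specification.

```python
import numpy as np

def abnormal_return(stock, market, event_idx, est_window=120):
    """Market-model event study: fit stock returns on market returns over
    the estimation window preceding the event, then return the abnormal
    return (actual minus model-expected) on the event day."""
    s = stock[event_idx - est_window:event_idx]
    m = market[event_idx - est_window:event_idx]
    beta, alpha = np.polyfit(m, s, 1)   # OLS slope (beta) and intercept (alpha)
    expected = alpha + beta * market[event_idx]
    return stock[event_idx] - expected
```

A disclosure that costs a vendor 0.6% of value would show up here as an abnormal return of roughly -0.006 on the announcement day.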

Telang, Rahul
Wattal, Sunil
Working Paper2005
In Search of a Unified Theory for Early Predictive Design Evaluation for Software

Traditional engineering design discipline calls for designs to be evaluated long before they are implemented. Early design evaluations predict properties of the artifact that will result from a proper implementation of the design and the value of those properties to the client or end user. The predicted properties can include costs as well as functionality, performance, and quality measures. Software engineering has some such evaluation techniques, but the discipline lacks a systematic way to explain, compare, develop, and apply them. This paper discusses the role of early predictive design evaluation in software design, shows how a variety of specific predictors serve this role, and proposes a unifying framework, Predictive Analysis for Design (PAD), for design evaluation techniques. Focus is given to techniques that predict the value of the finished software system to its client or end user and that make the predictions before the expense of software development or integration is incurred. It is shown that the PAD framework, even in its preliminary state, is sufficiently expressive to be useful in explaining and characterizing design evaluation techniques.

Arora, Ashish
Working Paper2005
Internet Exchanges for Used Books: An Empirical Analysis of Product Cannibalization and Welfare Impact

Information systems and the Internet have facilitated the creation of used product markets that feature a dramatically wider selection, lower search costs, and lower prices than their brick-and-mortar counterparts. The increased viability of these used product markets has caused concern among content creators and distributors, notably the Book Publishers Association and Authors Guild, who believe that used product markets will significantly cannibalize new product sales. However, this proposition, while theoretically possible, is based on speculation rather than empirical evidence. In this paper, we empirically analyze the degree to which used products cannibalize new product sales for books - one of the most prominent used product categories sold online - using a unique dataset collected from Amazon.com's new and used book marketplaces. We then use these estimates to measure the resulting first-order changes in publisher welfare and consumer surplus.

Smith, Michael
Telang, Rahul
Working Paper2005
Is Academic Science Driving a Surge in Industrial Innovation? Evidence From Patent Citations

What is driving the remarkable increase over the last decade in the propensity of patents to cite academic science? Does this trend indicate that stronger knowledge spillovers from academia have helped power the surge in innovative activity in the U.S. in the 1990s? This paper seeks to shed light on these questions by using a common empirical framework to assess the relative importance of various alternative hypotheses in explaining the growth in patent citations to science. Our analysis supports the notion that the nature of U.S. inventive activity has changed over the sample period, with an increased emphasis on the use of the knowledge generated by university-based scientists in later years. However, the concentration of patent-to-paper citation activity within what we call the "bio nexus" suggests that much of the contribution of knowledge spillovers from academia may be largely confined to bioscience-related inventions.

Branstetter, Lee
Working Paper2005
Long-Run Trends in Incarceration of Drug Offenders in the US

Estimates are developed for the number of people incarcerated in the US for drug-law violations from 1972 to 2002, broken down by type of institution (federal prison, state prison, or jail) and, to the extent possible, by nature of drug offense (possession/use, trafficking, or other). These time series are compared to trends in drug use indicators, revealing at best weak correlations, and the absolute levels are compared to different market indicators to draw various inferences. For example, even though about 480,000 people are incarcerated for drug-law violations, on average retail sellers spend less than two hours behind bars per sale. Still, full-time sellers might expect to spend three months incarcerated per year of selling, suggesting that there are roughly four active drug sellers for every one who is incarcerated.

Caulkins, Jonathan
Working Paper2005
Marijuana Markets: Inferences from Reports by the Household Population

Generally more is known about drug use and demand than about markets and supply, in large part because population survey data are available while market data are not. Although the household population represents a relatively small proportion of users of hard drugs, it represents a large proportion of the population using marijuana and participating in marijuana markets. This paper provides a description of marijuana market and acquisition patterns as reported by participants in the 2001 National Household Survey on Drug Abuse. We find that most respondents obtain marijuana indoors (87%), from a friend or relative (89%), and for free (58%). Retail marijuana distribution appears to be embedded in social networks, rather than being dominated by "professional" sellers. Despite these contrasts with stereotypical street markets for cocaine and heroin, there are also similarities, such as evidence of quantity discounts and a minority of users accounting for the majority of purchases. It is estimated that there are on the order of 400 million retail marijuana purchases in the U.S. each year and that the average purchase size is small, about 6-7 joints.

Caulkins, Jonathan
Working Paper2005
Modelling the spread of hepatitis C via commercial tattoo parlours: Implications for public health interventions

Hepatitis C (HCV) is a serious infection caused by a blood-borne virus. It is a contagious disease spreading rapidly via a variety of transmission mechanisms including contaminated tattoo equipment. Effectively regulating commercial tattoo parlours can greatly reduce this risk. This paper models the cost-effectiveness and optimal timing of such interventions, and parameterizes the model with data for Vienna, Austria. This dynamic model of the contagious spread of HCV via tattooing and other mechanisms accounts for secondary infections and shows that regulation can be highly cost-effective.

Caulkins, Jonathan
Working Paper2005
Operations Research & Public Policy for Africa: Harnessing the Revolution in Management Science Instruction

Operations research (OR) has made major contributions in the developed world to public policy domains that are of great relevance to Africa. Inasmuch as OR has failed to live up to its potential for addressing such issues in Africa, a principal barrier may have been the distance between OR analysts and decision makers. However, the revolution in management science instruction and the potential to train end-user modelers has democratized OR. This makes training for policy makers and managers in the public and non-profit sectors in Africa both feasible and highly beneficial. Existing management science courses for public and non-profit leaders, such as those taught at Carnegie Mellon, could be adapted to fit the needs of educators and policy makers in Africa and disseminated via a "train the trainers" approach. A plan is sketched whereby 800,000 end-user modelers might be trained in Africa (1 for every 1,000 people) at a cost of about $5M per year. Such budgets are well within the range of investments in human capital formation currently being made in Africa.

Caulkins, Jonathan
Working Paper2005
Optimal Policy for Software Vulnerability Disclosure

Software vulnerabilities represent a serious threat: most cyber-attacks exploit known vulnerabilities. Unfortunately, there is no agreed-upon policy for their disclosure - white-hats who discover vulnerabilities, security mailing lists and CERT follow different ad-hoc policies. This paper develops a framework to analyze the optimal timing of disclosure policy (the time given to the vendor to patch the vulnerability). Disclosure policy indirectly affects the speed and quality of the patch that a vendor develops, and thus CERT and similar bodies acting in the public interest can use it to influence the behavior of vendors and reduce social cost. This paper formulates a game-theoretic model involving a social planner who sets disclosure policy and a vendor who decides on patching. It is shown that vendors always choose to patch later than the socially optimal disclosure time. The social planner can optimally shrink the disclosure window to push vendors to deliver patches in a timely manner. The basic model is extended in a number of directions, most importantly by allowing the proportion of users implementing patches to depend upon the quality of the patch, which is itself a choice variable for the vendor. The paper provides a decision framework for understanding how disclosure timing may affect a vendor's decisions and, in turn, what a policy maker should do.

Arora, Ashish
Telang, Rahul
Working Paper2005
Price and Purity Analysis for Illicit Drug: Data and Conceptual Issues

Data on illicit drug purity and prices are invaluable but problematic. Purists argue they are unsuitable for economic analysis (Manski et al., 2001; Horowitz, 2001), but in reality they are used frequently (ONDCP 2001a, 2001b, 2004; Grossman, 2004). This paper reviews data and conceptual issues that people producing, analyzing, and consuming drug price and purity series should understand in order to reduce the likelihood of misinterpretation. It also identifies aspects of drug markets that are both poorly understood and relevant to some of these issues. They constitute a useful research agenda for health and law enforcement communities who would benefit from better data on the supply, availability, and use of illicit drugs.

Caulkins, Jonathan
Working Paper2005
Quality Cycles and the Strategic Manipulation of Value

Applicable, Not
Working Paper2005
Sensitivity of MRQAP Tests to Collinearity and Autocorrelation

MRQAP (Multiple Regression - Quadratic Assignment Procedure) tests are permutation tests for multiple regression coefficients for data organized in square matrices instead of vectors. Such a data structure is typical in social network studies, where variables indicate some type of relation between a given set of actors. Over the last 15 years, new approaches to permutation tests have been developed, and some of the proposed tests have been found to be substantially more robust against collinearity in the data. However, most studies evaluating the performance of permutation tests in linear models for square matrices do not consider the type of structural autocorrelation that is typical of social network data. This paper presents a new permutation method that complements the family of extant tests. The performance of the various approaches to MRQAP tests is evaluated under conditions of row and column autocorrelation in the data as well as collinearity between the variables, through an extensive series of simulations.
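
For readers unfamiliar with QAP, the core idea can be sketched in the single-predictor case: permute the rows and columns of the dependent matrix with one shared permutation (so its dyadic structure is preserved) and compare the observed coefficient to the permutation distribution. This is the classic simple QAP test, not the new method the paper proposes; the function name and defaults are illustrative assumptions.

```python
import numpy as np

def qap_test(y, x, n_perm=500, seed=0):
    """Single-predictor QAP test: permute rows and columns of the dependent
    matrix y with the same permutation (preserving dyadic structure) and
    compare the observed slope of y on x to the permutation distribution."""
    rng = np.random.default_rng(seed)
    n = y.shape[0]
    off = ~np.eye(n, dtype=bool)            # off-diagonal dyads only
    slope = lambda a, b: np.polyfit(b[off], a[off], 1)[0]
    observed = slope(y, x)
    perm = np.array([slope(y[np.ix_(p, p)], x)
                     for p in (rng.permutation(n) for _ in range(n_perm))])
    pval = np.mean(np.abs(perm) >= abs(observed))
    return observed, pval
```

Permuting actors rather than individual cells is what distinguishes QAP from a naive permutation test: row/column dependence within the network is left intact under the null.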

Krackhardt, David
Working Paper2005
Service Adoption and Pricing of Content Delivery Network (CDN) Services

Content Delivery Networks (CDNs) are a vital component of the Internet's content delivery value chain, servicing nearly a third of the Internet's most popular content sites. However, in spite of their strategic importance, little is known about the optimal pricing policies or adoption drivers of CDNs. We address these questions using analytic models of the market structure for Internet content delivery. This paper finds that, consistent with industry practices, CDNs should provide volume discounts to content providers when traffic burstiness is similar across content providers. However, when different content providers have varying traffic burstiness, as expected in reality, CDNs should provide relatively lower volume discounts, even leading to convex price functions in some cases. Surprisingly, it is also found that content providers with bursty traffic provision less infrastructure compared to those with lower burstiness, that CDNs are able to charge more in the presence of bursty traffic, and that content providers with bursty traffic realize lower surplus. Similarly, it is found that a pricing policy that accounts for both the mean and variance in traffic, such as percentile-based pricing, does better than pure volume-based pricing. Finally, it is shown that larger CDN networks can charge higher prices in equilibrium, strengthening any technology-based economies of scale.

Smith, Michael
Working Paper2005
Social Network Analysis

Richards, Seth. 2005. “A Social Network Analysis into the David Kelly Tragedy.” Connections, 26(2): 25-32.

Richards-Shubik, Seth
Article2005
Spreadsheet Errors and Decision Making:

There is consensus in the literature that spreadsheets are both ubiquitous and error-prone, but there is little direct evidence concerning whether spreadsheet errors frequently lead to bad decision making. As part of this research, 45 executives and senior managers/analysts in the private, public, and non-profit sectors were interviewed about their experiences with spreadsheet errors and quality control procedures. Differences across sectors do not seem pronounced. Almost all respondents report that spreadsheet errors are common. Most can report instances in which errors directly led to losses or bad decisions, but opinions differ as to whether the consequences of spreadsheet errors are severe. Error checking and quality control procedures are in most cases informal. A significant minority of respondents believe such ad hoc processes are sufficient because the "human in the loop" can detect any gross errors. Others thought more formal spreadsheet quality control processes could be beneficial.

Caulkins, Jonathan
Working Paper2005
The Need for Dynamic Drug Policy

Drug use in a population varies dramatically over time in no small measure due to nonlinear feedback among factors endogenous to the drug system. This suggests that drug policy ought likewise to be dynamic, varying the mix of strategies over time as drug use waxes and wanes. A growing literature that models drug "epidemics" mathematically supports this hypothesis and offers perspectives that may break policy logjams. For example, supply control may be most effective early, in the explosive growth stage of an epidemic. Conversely, treatment and measures to mitigate the consequences of dependent use and flagrant drug markets may have their comparative advantage later, in the endemic stage. Fully harnessing the power of dynamic drug policy will require more research and collection of new types of data, but the promise is worth the effort.

Caulkins, Jonathan
Working Paper2005
An Empirical Analysis of Cellular Voice and Data Services

Cellular telephony and associated data services have been a major social phenomenon for well over a decade now, changing the way - in some countries more than others - in which people communicate. In many countries in Northern Europe and Asia penetration rates are very high, and in others less so, but in all cases cellular service has engendered change at multiple levels - socially, as noted, and in terms of market structure and competition with the established Incumbent Local Exchange and Inter Exchange service providers. However, little work has been published in the academic literature on user consumption of cellular voice and data services, due to the unavailability of longitudinal data at the individual user level on consumption of voice and data services. We have such data from a large cellular service provider in Asia. Demand for voice and data services is influenced by the tariffs or "service plans" offered by firms. In our analysis we empirically estimate the drivers of demand for cellular services and how demographic and plan characteristics affect user choices. We first provide a theoretical model, then provide insight into consumption patterns of cellular voice and data services over a one-year period and relate them to service plan design.

Telang, Rahul
Working Paper2004
Are Medical Treatments for Individuals and Groups Like Single-Play and Multiple-Play Gambles?

DeKay, Michael
Working Paper2004
Decision Support Technology for Public Safety Resource Allocation: Location of Fire Stations in a Fiscally Constrained Environment

The City of Pittsburgh is facing a severe financial crisis and seeks strategies to reduce expenditures while maintaining an acceptable quality of services. Currently the Bureau of Fire accounts for nearly twenty percent of the City's expenses, and evidence from cities of similar size suggests that fire service expenditures may be one source of fiscal economies. This report represents the culmination of efforts to design and implement an information system to aid City decision makers in designing policy alternatives for fire services. This decision support methodology generates service characteristics for existing and proposed station configurations of Bureau of Fire services in the City of Pittsburgh. Additionally, the methodology develops alternative station configurations that optimize stated goals of the decision makers.

Johnson, Michael
Wendholt, Amy
Working Paper2004
Does in-house R&D increase bargaining power? Evidence from the pharmaceutical industry

According to Gans & Stern (1999), firms engage in R&D spending, in part, in order to improve their bargaining position as buyers in the market for technology. This theory is tested empirically with data from the pharmaceutical industry. We develop and estimate a structural model of R&D spending and licensing. We find that R&D spending does improve the bargaining position of licensees, although the effect is small. In the absence of the bargaining power effect, spending on R&D would be about 6% lower than it is. We also find that entry of technology licensors reduces firms' own R&D but has a positive overall effect on innovation.

Arora, Ashish
Vogt, William
Working Paper2004
Framework for Validating Geographic Profiling

This short paper was prepared for the NIJ Roundtable for Developing an Evaluation Methodology for Geographic Profiling Software (August 10 and 11, 2004) on approaches for validating geographic profiling (GP) methods. The paper presents a framework for validating any GP method or software package using solved serial crimes, including data on crime locations and criminal residences or other anchor points (e.g., work location, girlfriend's residence, etc.). Findings from the literature and from our analysis include: 1) the appropriate performance measure for GP (one that matches policing needs, as extended in this paper) concerns prioritizing relevant areas for investigation; 2) future work should correct the performance measure of GP by excluding irrelevant areas from consideration, such as rivers, lakes, cemeteries, etc. (past studies apparently did not do this); 3) additional model parameters may be estimable in empirical studies, such as the amount to expand the search area for a serial criminal beyond the minimum rectangle or other boundary enclosing crime sites; and 4) future validation studies for GP should compare alternative models, including simple models for benchmarking, and use holdout samples in a resampling scheme for validating performance.

Gorr, Wilpen
Working Paper2004
Hidden Strategic Challenges Posed by Housing Mobility Policy: An Application of Dynamic Policy Modeling

Over the past decade, shifts in subsidized and affordable housing policy have led to a greater role for market dynamics and individual choice on the part of program participants and their new neighbors, and a greater awareness of the importance of neighborhood on family outcomes. Given these trends, there is an opportunity for innovative prescriptive planning models to assist in the design of policy related to regional housing mobility. The goal of this paper is to identify, and answer, some housing policy analytic questions with these models.

Caulkins, Jonathan
Johnson, Michael
Working Paper2004
Leading Indicators and Spatial Interactions: A Crime Forecasting Model for Proactive Police Deployment

Based on the crime attractor and displacement theories of environmental criminology, this paper specifies a leading indicator model for forecasting serious property and violent crimes. The model, intended to support tactical deployment of police resources, is at the micro-level scale; namely, one-month-ahead forecasts over a grid system of 104 square grid cells 4,000 feet on a side (with approximately 100 blocks per grid cell). The leading indicators are selected lesser crimes and incivilities entering the model in two ways: 1) as time lags within grid cells and 2) as time and space lags averaged over grid cells contiguous to the observation grid cell. The validation case study uses 1.3 million police records covering 16 individual crime types from Pittsburgh, Pennsylvania, aggregated over the grid system for a 96-month period ending in December 1998. The study uses a rolling-horizon forecast experimental design with forecasts made over the 36-month period ending in December 1998, yielding 3,774 forecast errors per forecast model.
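
The rolling-horizon design mentioned above can be sketched: at each forecast origin the model is refit on a moving training window and scored on a one-month-ahead forecast. A single lagged leading indicator and a plain OLS fit stand in for the paper's full multi-indicator specification; all names and the window length are illustrative assumptions.

```python
import numpy as np

def rolling_one_step_forecasts(y, x_lead, train_len):
    """Rolling-horizon design: at each forecast origin t, refit a linear
    model of crime counts y on the one-month-lagged leading indicator
    x_lead over the most recent train_len months, forecast y[t + 1],
    and record the forecast error."""
    errors = []
    for t in range(train_len, len(y) - 1):
        X = x_lead[t - train_len:t]           # lagged indicator, training window
        Y = y[t - train_len + 1:t + 1]        # matching one-month-ahead targets
        b, a = np.polyfit(X, Y, 1)            # slope b, intercept a
        errors.append(y[t + 1] - (a + b * x_lead[t]))
    return np.asarray(errors)
```

Applying such a loop per grid cell over a multi-year evaluation window is what produces the large sample of forecast errors used to compare competing models.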

Cohen, Jacqueline
Gorr, Wilpen
Working Paper2004
Neighborhood Selection of Public Housing Residents in the Housing Choice Voucher Program: Quasi-Experimental Results from Chicago

Millions of families nationwide are supported by housing subsidies that have traditionally tied them to a place - a public housing unit. Based on promising results from the Gautreaux program and the midterm evaluation of the national MTO experiment, housing policy researchers deem it reasonable to relocate all households in public housing projects to rent-subsidized units on the open market via the Housing Choice Voucher Program (HCVP). Policy design based on voucher-based housing subsidies requires detailed knowledge of the preferences and relocation choices of subsidy recipients, yet little is known about these decision processes. In this paper we seek to understand the intricacies of families' relocation decisions given an opportunity to use housing vouchers. Data for this study are derived from an initiative in Chicago dating from 1997 in which the waiting list for the then "Section 8" program was purged in a management review and then repopulated. Families on the waiting list were then chosen to receive Section 8 vouchers via a lottery. This paper uses a logit model to identify correlates of destination outcomes. Unlike most studies of housing relocation, our choice set is comprised of census tracts, based on a belief that families originating in public housing have limited exposure to distant destinations and therefore do not make decisions based on larger aggregates. We find that a focus on census tracts leads to large but tractable models, and that most households relocate to tracts close to their origin address. We also find that age, sex, and employment, among other factors, affect the relocation choice.

Johnson, Michael
Working Paper2004
Patent protection, complementary assets and firms' incentive for technology licensing

Technology transactions, such as licensing and R&D-based alliances, have been growing rapidly in recent years. Even as technology licensing has grown, so has patenting. Both trends foreshadow possibly profound changes in firms' strategies. In this paper, we develop a simple structural model in which both patenting and licensing are jointly determined by factors such as patent effectiveness, the presence and strength of commercialization capabilities and their complementarity with R&D activity, and industry and technology characteristics, such as the nature of knowledge and the degree of technological competition. This paper estimates the model using the 1994 Carnegie Mellon survey on industrial R&D, which provides detailed information on the patenting and licensing activities of manufacturing firms in the U.S. A key feature of the model is that it naturally implies that the impact of patent effectiveness on licensing behavior will be conditioned by commercialization capabilities. It is found that increases in patent effectiveness increase both patenting and licensing propensity. Conditional on patenting, increases in patent effectiveness decrease licensing propensity. However, higher patent effectiveness elicits much larger increases in licensing from firms lacking commercialization capability or characterized by a lower degree of complementarity between the R&D and marketing or production functions.

Arora, Ashish
Working Paper2004
Policy Systems: The Integration of Information Technology into Policy Analysis, Planning, and Program Analysis

This paper defines a policy system to be a collection of hardware, software, communication technologies, persons, procedures, protocols, and standards driven by and for the purpose of advancing a public organization’s mission in regard to policy analysis, planning, and program evaluation decisions. While policy systems already exist in practice, they have not been identified and studied as a separate, distinguishable area of information systems. They have components and patterns of use that could benefit governments of all levels in carrying out policy making. This paper proposes principles for building policy systems, identifies their components, discusses how they address the complexities of policy making, illustrates them with several examples including a policy system built for a local government agency, and distinguishes them from related systems such as management information systems, decision support systems, and collaboratories.

Gorr, Wilpen
Johnson, Michael
Roehrig, Stephen
Working Paper2004
Tabu Search Enhanced Markov Blanket Classifier for High Dimensional Data Sets

Data sets with many discrete variables and relatively few cases arise in health care, ecommerce, national security, and many other domains. Learning effective and efficient prediction models from such data sets is a challenging task. This paper proposes a Tabu Search enhanced Markov Blanket (TS/MB) procedure to learn a graphical Markov Blanket classifier from data. The TS/MB procedure is based on the use of restricted neighborhoods in a general Bayesian network constrained by the Markov condition, called Markov Equivalent Neighborhoods. Computational results from real-world data sets drawn from the health care domain indicate that the TS/MB procedure converges fast, is able to find a parsimonious model with substantially fewer predictor variables than in the full data sets, gives comparable or better prediction performance when compared against several machine learning methods, and provides insight into possible causal relations among the variables.

Padman, Rema
Working Paper2004
The Globalization of the Software Industry: Perspectives and Opportunities for Developed and Developing Countries

The spectacular growth of the software industry in some non-G7 economies has aroused both interest and concern. This paper addresses two sets of inter-related issues. First, the determinants of these success stories are explored. This paper then touches upon the broader question of what lessons, if any, can be drawn for economic development more generally. Finally, examining the long-term implications of the offshoring of software, it is concluded that offshoring is unlikely to pose a long-term threat to American technological leadership. Instead, the U.S. economy will broadly benefit from the growth of new software producing regions. The U.S. technological leadership rests in part upon the continued position of the U.S. as the primary destination for highly trained and skilled scientists and engineers from the world over. Though this is likely to persist for some time, the increasing attractiveness of foreign emerging-economy destinations is a long-term concern for continued U.S. technological leadership.

Arora, Ashish
Working Paper2004
When Things Don't Add Up: The Role of Perceived Fungibility in Repeated-Play Decisions

Previous research on repeated-play decisions has focused on choices with fungible outcomes. In two studies, we investigated the perceived fungibility of outcomes over repeated plays of risky prospects in a variety of situations, as well as the relationship between perceived fungibility and preferences for taking risks in those situations. Perceived fungibility varied substantially across participants and situations, with outcomes experienced by different people (e.g., medical outcomes for different patients) receiving lower scores. Higher perceived fungibility was associated with more favorable evaluations of repeated plays of risky prospects with positive expectations. Additionally, perceived fungibility moderated the effect of repetition, such that the increased attractiveness of repeated plays relative to a single play was diminished when perceived fungibility was low. Although evaluating the overall distribution of outcomes is arguably rational when monetary outcomes accrue to one person, treating each play as a separate event may be rational when aggregation is considered inappropriate.

DeKay, Michael
Working Paper2004
An Integrated Approach to Developing Human Services Web Portals

A major goal for Human Services Web Portals is to make as much expertise available as possible for clients and their caregivers. The expertise covers three main areas - diagnosing a client’s problem, identifying available resources for solution, and finally, providing assistance to package these resources into a service plan that will serve as a solution for the client’s problem. The main challenge in setting up a human services web portal lies in the diversity and complexity of both the client set and the set of problems clients want to address; offering prepackaged solutions is therefore not an option. We describe an integrated human services web portal design and provide a phased approach for implementation. Finally, we generalize our design for other domains in which external expertise is required as a component of service delivery.

Gorr, Wilpen
Working Paper2003
Application of Tracking Signals to Detect Time Series Pattern Changes in Crime Mapping Systems

Tracking signals are widely used in industry to monitor inventory and sales demand. These signals automatically and quickly detect departures in product demand, such as step jumps and outliers, from "business-as-usual". This paper explores the application of tracking signals in crime mapping to automatically identify areas that are experiencing changes in crime patterns and thus may need police intervention. Detecting such changes through visual examination of time series plots, while effective, creates too large a workload for crime analysts, easily on the order of 1,000 time series per month for medium-sized cities. We demonstrate the so-called smoothed-error-term tracking signal and carry out an exploratory validation on 10 grid cells for Pittsburgh, Pennsylvania. Underlying the tracking signal is an extrapolative forecast that serves as the counterfactual basis of comparison. The approach to validation is based on the assumption that we wish tracking signal behavior to match decisions made by crime analysts in identifying crime pattern changes. Tracking signals are presented in the context of crime early warning systems that provide wide-area scanning for crime pattern changes and detailed drill-down maps for crime analysis. Based on preliminary results, the tracking signal is a promising tool for crime analysts.
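A smoothed-error-term tracking signal of the kind the abstract references can be sketched as Trigg's signal over a simple exponential smoothing forecast. This is a minimal illustration; the smoothing constants and the step-jump series are assumptions, not the paper's calibration.

```python
def tracking_signal(series, alpha=0.2, beta=0.1):
    """Trigg's smoothed-error tracking signal over a simple exponential
    smoothing forecast. |signal| near 1 flags a departure from
    business-as-usual, such as a step jump in crime counts."""
    level = series[0]            # current forecast
    e_s, mad = 0.0, 1e-9         # smoothed error, smoothed absolute error
    signals = []
    for y in series[1:]:
        err = y - level                      # one-step-ahead forecast error
        e_s = beta * err + (1 - beta) * e_s
        mad = beta * abs(err) + (1 - beta) * mad
        signals.append(e_s / mad)
        level = level + alpha * err          # update the forecast
    return signals

# Flat monthly counts for a grid cell, then a sustained step jump.
series = [10] * 12 + [25] * 6
signals = tracking_signal(series)
```

During the flat stretch the signal stays near zero; at the step jump it spikes toward 1, which is the behavior an analyst would want flagged automatically.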

Gorr, Wilpen
Working Paper2003
Disclosure Risk vs. Data Utility: The R-U Confidentiality Map

Recognizing that deidentification of data is generally inadequate to protect their confidentiality against attack by a data snooper, information organizations (IOs) can apply a variety of disclosure limitation (DL) techniques, such as topcoding, noise addition, and data swapping. Desirably, the resulting restricted data have both high data utility U to data users and low disclosure risk R from data snoopers. IOs lack a coherent framework for examining tradeoffs between R and U for a specific DL procedure. They also lack systematic ways of comparing the performance of distinct DL procedures. To provide this framework and facilitate comparisons, the R-U confidentiality map is introduced to trace the joint impact on R and U of changes in the parameters of a DL procedure. Implementation of an R-U confidentiality map is illustrated in real multivariate data cases for two DL techniques: topcoding and multivariate noise addition. Topcoding is examined for a Cobb-Douglas regression model, as fit to restricted data from the New York City Housing and Vacancy Survey. Multivariate additive noise is examined under various scenarios of attack, predicated on different knowledge states for a data snooper, and for different goals of a data analyst. We illustrate how simulation methods can be used to implement an empirical R-U confidentiality map, which is suitable for analytically intractable specifications of R, U and the disclosure limitation method. Application is made to the Schools and Staffing Survey, which is conducted by the National Center for Education Statistics.
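For the noise-addition case, the shape of an R-U map can be sketched by sweeping the noise parameter and tracing (R, U) pairs. The risk and utility functions below are simple illustrative proxies, not the paper's measures.

```python
import math

def risk(sigma, tol=0.5):
    """Proxy disclosure risk: probability that a value masked with
    Gaussian noise of std `sigma` lands within `tol` of the true value."""
    if sigma == 0:
        return 1.0
    return math.erf(tol / (sigma * math.sqrt(2)))

def utility(sigma, data_var=1.0):
    """Proxy data utility: share of variance in the released variable
    attributable to the true data rather than the added noise."""
    return data_var / (data_var + sigma ** 2)

# Trace an empirical R-U curve by sweeping the DL parameter sigma.
curve = [(s, risk(s), utility(s)) for s in (0.1, 0.5, 1.0, 2.0)]
```

Both proxies fall as sigma grows, which is the essential tradeoff the map visualizes: more noise buys lower disclosure risk at the cost of data utility.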

Duncan, George
Working Paper2003
Effect of Information Revelation Policies under Cost Uncertainty

Electronic reverse markets such as those hosted by Freemarkets involve geographically dispersed sellers. By the very nature of the market, sellers in any given market session are uncertain both about the number of opponents they face and about their cost structure. Over the course of several market sessions, sellers can learn about the competitive structure of the market. Their ability to learn, i.e., their ability to reduce the level of uncertainty, depends on the revelation policy adopted. The extent to which competitive information is revealed under each revelation policy determines what sellers learn, how they bid in the future and, thus, the consumer surplus generated. This paper compares a set of revelation policies commonly used in electronic reverse marketplaces, using consumer surplus as our metric. Game-theoretic models are employed to focus on the effect of revelation policies when firms are uncertain about their opponents’ cost. Based on the analysis, this paper provides intuition as to why, under certain conditions, one setting is better than another.

Working Paper2003
Enhancing Aviation Security with The SWIFT System (Short Wait Integrated Flight Travel)

The SWIFT (Short-Wait Integrated Flight Travel) System represents an initiative by Carnegie Mellon University’s H. John Heinz III School of Public Policy and Management to design an airport security system that is both more secure and more efficient. With SWIFT, travelers who volunteer to submit to a security clearance by TSA and pass will receive a “smart card” containing personal and biometric information. SWIFT enrollees will be able to go through a security screening comparable to that which was used prior to September 11, 2001. By screening SWIFT enrollees prior to their arrival at an airport, TSA can focus its resources on more thorough screening of not-cleared individuals. Our research focused on estimating the demand for SWIFT enrollment, design of a reasonable system that included biometric screening of all those carrying a SWIFT card, review of current technological opportunities for screening SWIFT passengers, initial design of a national SWIFT network, identifying processing enhancements of the current system, analyzing costs and benefits of the various improvements through the use of simulation modeling, and designing an initial test implementation of the SWIFT System at the Pittsburgh International Airport (PIT). The results of this research hold promise for creating an airport security system that is markedly more secure and more efficient than the current one.

Working Paper2003
Entry and Competition in Local Hospital Markets

There has been considerable consolidation in the hospital industry in recent years. Over 900 deals occurred from 1994-2000, and many local markets, even in large urban areas, have been reduced to monopolies, duopolies, or triopolies. This surge in consolidation has led to concern about competition in local markets for hospital services. We examine the effect of market structure on competition in local hospital markets - specifically, does the hardness of competition increase with the number of firms? We extend the entry model developed by Bresnahan and Reiss to make use of quantity information, and apply it to data on the U.S. hospital industry. In the hospital markets we examine, entry leads to a quick convergence to competitive conduct. Entry reduces variable profits and increases quantity. Most of the effects of entry come from having a second and a third firm enter the market. The fourth entrant has little estimated effect. The use of quantity information allows us to infer that entry is consumer-surplus-increasing.

Gaynor, Martin
Vogt, William
Working Paper2003
Estimation of Crime Seasonality: A Cross-Sectional Extension to Time Series Classical Decomposition

Reliable estimates of crime seasonality are valuable for law enforcement and crime prevention. Seasonality affects many police decisions from long-term reallocation of uniformed officers across precincts to short-term targeting of patrols for hot spots and serial criminals. This paper shows that crime seasonality is a small-scale, neighborhood-level phenomenon. In contrast, the vast literature on crime seasonality has almost exclusively examined crime data aggregations at the city or even larger scales. Spatial heterogeneity of crime seasonality, however, often gives rise to opposing seasonal patterns in different kinds of neighborhoods, canceling out seasonality at the city-wide level. Thus past estimates of crime seasonality have vastly underestimated the magnitude and impact of the phenomenon. This paper presents a model for crime seasonality that extends classical decomposition of time series based on a multivariate, cross-sectional, fixed-effects model. The crux of the model is an interaction of monthly seasonal dummy variables with five factor scores representing the urban ecology as viewed from the perspective of major crime theories. The urban ecology factors, interacted with monthly seasonal dummy variables, provide neighborhood-level seasonality estimates. A polynomial in time and fixed-effects dummy variables for spatial units control for large temporal and spatial variations in crime data. Our results require crime mapping for implementation by police, including thematic mapping of next month's forecasted crime levels (which are dominated by seasonal variations) by grid cell or neighborhood, thematic mapping of the urban ecology for developing an understanding of underlying causes of crime, and the ability to zoom into neighborhoods to study recent crime points.

Cohen, Jacqueline
Gorr, Wilpen
Working Paper2003
Optimal Bidding in Sequential Online Auctions

Auctions are widely used online to conduct commercial transactions. An important feature of online auctions is that even bidders who intend to buy a single object frequently have the opportunity to bid in sequential auctions selling identical objects. This paper studies key features of the optimal bidding strategy, assuming rational, risk-neutral agents with independent private valuations and sealed-bid second-price sequential auctions. In contrast to previous work on this topic, we develop our theory using the concept of the "option value" of an upcoming auction - a measure of the expected payoff from being able to participate in a future auction. This option value depends, among other things, upon the mean and variance of the future number of bidders. This paper presents an optimal bidding strategy in sequential auctions that incorporates option value assessment. Furthermore, it is established that the optimal bidding strategy is tractable, since it is independent of the bidding strategies of other bidders in the current auction and depends only on the option value assessment.
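The shading logic can be sketched in closed form under stock textbook assumptions that are not from the paper: one future auction with a known number of opponents whose values are i.i.d. Uniform(0,1).

```python
def option_value(v, n_opponents):
    """Expected payoff for a bidder with value v from a future sealed-bid
    second-price auction against n opponents with i.i.d. Uniform(0,1)
    values: integral of (v - x) * n * x^(n-1) dx from 0 to v = v^(n+1)/(n+1)."""
    n = n_opponents
    return v ** (n + 1) / (n + 1)

def optimal_bid(v, n_opponents_next):
    """Bid true value minus the option value of the next auction; this
    shading does not depend on rivals' strategies in the current auction."""
    return v - option_value(v, n_opponents_next)
```

For example, a bidder with value 0.8 facing 3 opponents in the next auction shades the bid by 0.8^4/4 = 0.1024; with more expected future competition the option value shrinks and the current bid rises toward the true value.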

Arora, Ashish
Padman, Rema
Vogt, William
Working Paper2003
Policing Crime Guns

Jacqueline Cohen and Jens Ludwig (2003) "Policing Crime Guns - Research in Brief."  Working Paper.  H.J. Heinz III College, Carnegie Mellon University, Pittsburgh, PA.

Cohen, Jacqueline
Working Paper2003
Pricing Advice: The market for diagnostic information

Diagnostic information helps agents to make more accurate decisions. One such decision is about investing in projects with uncertain outcomes. The value of diagnostic information is the difference in expected payoffs with and without it, and it is shown that such a value is non-monotonic in the ex-ante expected value of the project to be undertaken. This paper analyzes optimal pricing schemes for selling information to buyers with unknown ex ante value. With a monopolist information seller, a striking result is that the optimal menu of contracts is remarkably simple. A pure royalty is offered to buyers whose projects have low ex-ante expected value and a pure fixed fee is offered to buyers whose projects have high ex-ante expected value. This result is robust to the presence of different types of information and to the introduction of competition in the market for diagnostic information.
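The non-monotonicity of the value of information is easy to see in a two-outcome sketch. The symmetric gain/loss payoffs below are illustrative assumptions, not the paper's model.

```python
def value_of_information(p, gain=1.0, loss=1.0):
    """Value of a perfect diagnostic signal about a project that pays
    `gain` with probability p and -`loss` otherwise: the difference
    between deciding after the signal and deciding on expectations alone."""
    ev_uninformed = max(0.0, p * gain - (1 - p) * loss)  # invest only if EV > 0
    ev_informed = p * gain                               # invest only on good news
    return ev_informed - ev_uninformed

# Low, middling, and high ex-ante project values.
vals = [value_of_information(p) for p in (0.1, 0.5, 0.9)]
```

The value of the signal peaks for projects of middling ex-ante value and falls off for projects that are clearly bad or clearly good, where the uninformed decision is almost always right anyway.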

Arora, Ashish
Working Paper2003
Research Tool Patenting and Licensing and Biomedical Innovation

Over the last two decades changes in technology and policy have altered the landscape of drug discovery. These changes have led to concerns that the patent system may be creating difficulties for those trying to do research in biomedical fields. Using interviews and archival data, this paper examines the changes in patenting in recent years and how these have affected innovation in pharmaceuticals and related biotech industries.


Arora, Ashish
Working Paper2003
Scalable Payments Netting in Electronic Commerce

Scalable Payments Netting in Electronic Commerce

Working Paper2003
Search and Product Differentiation at an Internet Shopbot

Price dispersion among commodity goods is typically attributed to consumer search costs. This paper explores the magnitude of consumer search benefits and costs using a data set obtained from a major Internet shopbot. For the median consumer, the benefits to searching lower screens are $6.55 while the cost of an exhaustive search of the offers is a maximum of $6.45. This paper also estimates price elasticities and finds that they are relatively high compared to offline markets, with a decrease in demand of 7 to 10 percent for each percentage increase in price, in the base model. Interestingly, in this setting, consumers who search more intensively are less price sensitive than other consumers, reflecting their increased weight on retailer differentiation in delivery time and reliability. The results demonstrate that even in this nearly-perfect market of the shopbot, substantial price dispersion can exist in equilibrium, arising from consumers' preferences over both price and non-price attributes.

Smith, Michael
Working Paper2003
Sell First Fix Later: Impact of Patching on Software Quality

This paper presents an economic model of fixing or patching a software problem after the product has been released in the market. Specifically, a software firm’s trade-off in releasing a buggy product early and investments in fixing it later is modelled. It is first shown that patching investments and time to enter the market are strategic complements such that higher investments in patching capability allow the firm to enter the market earlier. Just as the marginal cost of producing software can be effectively zero, so can be the marginal cost of repairing multiple copies of defective software by issuing patches. It is shown that due to the fixed cost nature of investments in patching, a vendor has incentives to release a buggier product early and patch it later in a larger market. This result is contrasted with other physical good markets. Thus, it is shown that a monopolist releases a product with fewer bugs but later than what is socially optimal. The model is extended to incorporate duopoly competition and show that in competition, the high value firm always enters earlier than the monopolist. Ironically the firm offering greater value to customers releases a product that initially is of lower quality (more bugs), but provides the greater value by releasing early (so customers can use the product sooner) and by investing more in patching so it can provide better after-sale support to its customers.

Arora, Ashish
Caulkins, Jonathan
Telang, Rahul
Working Paper2003
The software industry and India's economic development

This paper assesses the contribution of software to India’s economic development paying particular attention to the role of the software in the absorption of labour and the development of human capital in the Indian economy. India’s specialisation in software has been driven by two sorts of wage advantages that have reinforced each other: the lower wages for Indian software developers relative to that of their US and European counterparts makes Indian software cheaper in global markets, while the higher wages earned by software professionals in India relative to that in other industrial sectors has ensured a steady stream of supply of software professionals. However, the impact of this growth has been limited to a small section of the Indian economy, and there are questions whether the current growth can be sustained without a significant growth of domestic demand. We believe that export led growth is sustainable in the medium term. On the other hand, the success of the software industry has contributed to an increase in the relative value of professional workers - programmers, but also managers and analysts. In turn, the growing importance of human capital has led to innovative models of entrepreneurship and organisation, pioneered by the software sector, which are slowly taking root and spreading to other parts of Indian industry. A potentially important and underappreciated contribution of the software industry is thus as an exemplar of good entrepreneurship and corporate governance to the rest of Indian industry. Though less visible than the macro contributions to employment and foreign exchange, this role is a source of productivity improvement for all industry, which can have powerful long-term benefits for India’s industrialisation and growth.

Arora, Ashish
Working Paper2003
User Acceptance and Adoption of a Clinical Reminder System in Ambulatory Care: A Developmental Trajectory Approach

Evaluation studies of clinical decision support systems (CDSS) have tended to focus on assessment of system quality and clinical performance in a laboratory setting. Relatively few studies have used field trials to determine if CDSSs are likely to be used in routine clinical settings and whether reminders generated are likely to be evaluated by end-users. This paper argues that such beneficial outcomes are not likely to occur if use of the system results in side-effects such as decreased end-user efficiency and unanticipated changes in normal workflows.

Engberg, John
Johnson, Michael
Padman, Rema
Working Paper2003
A Field Study of Internet Behavior: Usage Levels and Task Preferences

A Field Study of Internet Behavior: Usage Levels and Task Preferences.

Bajaj, Akhilesh
Working Paper2002
A Model of Chaotic Drug Markets and Their Control

Drug markets are often described informally as being chaotic, and there is a tendency to believe that control efforts can make things worse, not better, at least in some circumstances. This paper explores the idea that such statements might be literally true in a mathematical sense by considering a discrete-time model of populations of drug users and drug sellers for which initiation into either population is a function of relative numbers of both populations. The structure of the system follows that considered in an arms control context by Behrens et al. (1997). In this context, the model suggests that depending on the market parameter values, the uncontrolled system may or may not be chaotic. Static application of either treatment or enforcement applied to a system that is not initially chaotic can make it chaotic and vice versa, but even if static control would create chaos, dynamic controls can be crafted that avoid it. So-called OGY controls seem to work well for this example.
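The qualitative claim - that the same discrete-time structure can be chaotic or not depending on parameter values - can be illustrated with the one-dimensional logistic map, a generic stand-in rather than the paper's two-population model.

```python
def logistic(x, r):
    """One step of the logistic map x -> r * x * (1 - x)."""
    return r * x * (1 - x)

def orbit(x0, r, n, burn=0):
    """Iterate the map `burn` times to discard transients,
    then record the next `n` states."""
    x = x0
    for _ in range(burn):
        x = logistic(x, r)
    out = []
    for _ in range(n):
        x = logistic(x, r)
        out.append(x)
    return out

# r = 2.5: the orbit settles onto the stable fixed point 1 - 1/r = 0.6.
settled = orbit(0.2, 2.5, n=1, burn=200)[0]

# r = 4.0: chaotic regime; nearby starting points diverge to order-one
# separation, the hallmark sensitivity to initial conditions.
a = orbit(0.2, 4.0, n=60)
b = orbit(0.2 + 1e-9, 4.0, n=60)
divergence = max(abs(x - y) for x, y in zip(a, b))
```

Changing a single parameter thus switches the system between orderly and chaotic behavior, which is the kind of regime shift that static control policies can inadvertently trigger.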

Caulkins, Jonathan
Working Paper2002
A Model of Moderation: Finding Skiba Points on a Slippery Slope

A simple model is considered that rewards "moderation" - finding the right balance between sliding down either of two "slippery slopes". Optimal solutions are computed as a function of two key parameters: (1) the cost of resisting the underlying uncontrolled dynamics and (2) the discount rate. Analytical expressions are derived for bifurcation lines separating regions where it is optimal to fight to stay balanced, to give in to the attraction of the "left" or the "right", or to decide based on one’s initial state. The latter case includes situations both with and without so-called Dechert-Nishimura-Skiba (DNS) points defining optimal solution strategies. The model is unusual for having two DNS points in a one-state model, having a single DNS point that bifurcates into two DNS points, and for the ability to explicitly graph regions within which DNS points occur in the 2-D parameter space. The latter helps give intuition and insight concerning conditions under which these interesting points occur.

Caulkins, Jonathan
Working Paper2002
An Age-Structured Single-State Initiation Model -- Cycles of Drug Epidemics and Optimal Prevention Programs

This paper introduces a model for drug initiation that extends traditional dynamic models by explicitly considering the age distribution of users. On the basis of a two-group model in which the population is split into a user and a non-user group, the advantage of a continuous age distribution is shown by capturing more detail and by yielding new results. Neglecting death rates reduces the model to a single-state (one-group) descriptive model which can still simulate some of the complex behaviours of drug epidemics, such as repeated cycles. Furthermore, prevention programs, especially school-based programs, can be targeted to certain age classes. So, in order to discover how best to allocate resources to prevention programs over different age classes, we formulate and solve optimal control models.

Caulkins, Jonathan
Working Paper2002
Are Invisible Hands Good Hands? Moral Hazard, Competition, and the 2nd Best in Health Care Markets

The nature and normative properties of competition in health care markets have long been the subject of debate. In this paper we consider what the optimal benchmark is in the presence of moral hazard effects on consumption due to health insurance. Moral hazard is widely recognized as one of the most important distortions in health care markets. In general, economic analysis suggests that marginal-cost pricing leads to static Pareto optimal allocations. In health care markets, however, moral hazard due to health insurance leads to excess consumption, in the sense that insured individuals will consume medical services past the point where the marginal utility of an additional service is equal to its marginal cost (Arrow, 1963; Pauly, 1968). Since health insurance pays for part or all of medical expenses, insured individuals face a price that is lower than the market price and consume more of the medical good than is optimal. Therefore it is not obvious that competition or marginal cost pricing is second best optimal given this distortion. The principal claim of this paper is that most of economists’ intuition regarding the welfare effects of price changes in markets not distorted by moral hazard applies quite well to markets where decision-making by consumers is distorted by moral hazard. In particular, lower prices are better for consumers than are higher prices. Furthermore, the gain to consumers from lowering price from supra-marginal cost levels to marginal costs outweighs the loss of profit to the medical industry. Finally, the usual method of computing consumer’s surplus by integration under the demand curve is still appropriate in markets with moral hazard.

Gaynor, Martin
Vogt, William
Working Paper2002
Change, Consolidation, and Competition in Health Care Markets

The health care industry is being transformed. Large firms are merging and acquiring other firms. Alliances and contractual relations between players in this market are shifting rapidly. Within the next few years, many markets are predicted to be dominated by a few large firms. Antitrust enforcement authorities like the Department of Justice and the Federal Trade Commission, as well as courts and legislators at both the federal and state levels, are struggling with the implications of these changes for the nature and consequences of competition in health care markets. This paper summarizes the nature of the changes in the structure of the health care industry, focusing on the markets for health insurance, hospital services, and physician services. Potential implications of the restructuring of the health care industry for competition, efficiency, and public policy are discussed. As will become apparent, this area offers a number of intriguing questions for inquisitive researchers.

Gaynor, Martin
Working Paper2002
Competition Among Hospitals

Competition Among Hospitals

Gaynor, Martin
Vogt, William
Working Paper2002
Competition Between Internet Search Engines

This paper develops a model of vertical differentiation in the Internet search engine market. A key property of the model is that users who try out one engine may be dissatisfied with the results, and consult another engine in the same session. This residual demand allows lower quality engines to survive in equilibrium. We consider a two-period game between an incumbent and an entrant who enters in the second period. Since users prefer to try out a higher quality engine first, the demand for an engine is discontinuous in quality, depending on whether the engine has high or low quality. We take into account brand loyalty for the incumbent. The interaction of brand loyalty and a cost advantage for the entrant determines which engine has higher quality in equilibrium.

Telang, Rahul
Working Paper2002
Consumer Surplus in the Digital Economy: Estimating the Value of Increased Product Variety

This paper presents a framework and empirical estimates that quantify the economic impact of increased product variety made available through electronic markets. Recent research has focused on the effect of increased competition on Internet market efficiency. While these efficiency gains significantly enhance consumer welfare, for instance by leading to lower average selling prices, our present research shows that increased product variety made available through electronic markets can be a significantly larger source of consumer welfare gains. One reason for increased product variety on the Internet is the ability of online retailers to stock, display, and sell a large number of products. There may also be large welfare gains in other SKU-intensive consumer goods such as music, movies, consumer electronics, and computer software and hardware.

Johnson, Michael
Working Paper2002
Counterterror and Counterdrug policies: Comparisons and Contrasts

Counterterror and Counterdrug policies: Comparisons and Contrasts

Caulkins, Jonathan
Working Paper2002
Cycles of Violence: A Dynamic Control Model

This paper introduces and analyzes a simple model of cycles of violence in which oscillations are generated when surges in lethal violence shrink the pool of active violent offenders. Models with such endogenously induced variation may help explain why historically observed trends in violence are generally not well correlated with exogenous forcing functions, such as changes in the state of the economy. The analysis includes finding the optimal dynamic trajectory of incarceration and violence prevention interventions. Those trajectories yield some surprising results, including situations in which myopic decision makers will invest more in prevention than will far-sighted decision makers.

Caulkins, Jonathan
Working Paper2002
Drug Policy: Insights from Mathematical Analysis

Illicit drug use is clearly an important health problem. There are some 600,000 emergency department episodes in the US every year that are related to illicit drugs (SAMHSA, 2002a). National mortality estimates are not available, but there are probably on the order of 20,000 drug-induced deaths a year (SAMHSA, 2002b), with many more indirectly related to drug use. Some 5 million Americans are in need of drug treatment, and less than 40% get it (Epstein and Gfroerer, 1998; Woodward et al., 1997). Injection drug use is a leading cause of the spread of infectious diseases such as HIV/AIDS and Hepatitis C (CDCP, 2001). The social costs of illicit drug use approach those of alcohol and tobacco (Rice et al., 1990; Bartlett et al., 1994; Harwood et al., 1998). No one has estimated how many quality adjusted life years are lost due to illicit drug use, but the number is no doubt substantial, particularly since those who die from illicit drug use are younger than those who die from most other causes. Not surprisingly there is an energetic debate concerning how best to control drug use and related consequences, to which Operations Research/Management Science has made important contributions. Nevertheless, drug policy is unlike other health policy domains in important ways, and this article begins with a review of some important differences. The following sections then highlight key insights quantitative models have generated concerning the relative effectiveness of different interventions, including how that effectiveness varies over the course of a drug epidemic.
 

Caulkins, Jonathan
Working Paper, 2002
Estimating the Relative Efficiency of Various Forms of Prevention at Different Stages of a Drug Epidemic

Drug use and problems change dramatically over time in ways that are often described as reflecting an "epidemic cycle". We use simulation of a model of drug epidemics to investigate how the relative effectiveness of different types of prevention varies over the course of such an epidemic. Specifically, we use the so-called LHY model (see Behrens et al., 2000b), which includes both "contagious" spread of initiation (a positive feedback) and memory of past use (a negative feedback), which dampens initiation and, hence, future use. The analysis confirms the common-sense intuition that prevention is more highly leveraged early in an epidemic, although the extent to which this is true in this model is striking, particularly for campaigns designed to leverage awareness of the drug’s dangers. The findings also suggest that the design of "secondary" prevention programs should change over the course of an epidemic.
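The interplay of the two feedbacks can be sketched in a few lines of simulation. The model below is only an illustrative two-compartment system in the spirit of the LHY setup, with light users L, heavy users H, and a decaying memory Y of past heavy use; every rate constant is hypothetical, invented for this sketch, and is not a parameter from Behrens et al. (2000b).

```python
def simulate(years=50, dt=0.1, L0=1e4, H0=0.0, Y0=0.0):
    """Euler simulation of a toy drug-epidemic model: contagious
    initiation (positive feedback) and a memory of past heavy use
    that suppresses initiation (negative feedback)."""
    # Hypothetical rate constants, chosen only to exhibit the feedbacks.
    tau = 5e4   # baseline (non-contagious) initiation per year
    s = 0.5     # recruitment of new initiates per current light user
    q = 3.0     # strength with which memory of heavy use deters initiation
    a = 0.20    # quit rate of light users
    b = 0.05    # escalation rate from light to heavy use
    g = 0.10    # quit rate of heavy users
    d = 0.05    # decay rate of the societal memory of heavy use
    L, H, Y = L0, H0, Y0
    series = []
    for _ in range(round(years / dt)):
        initiation = tau + s * L / (1.0 + q * Y)
        dL = initiation - (a + b) * L
        dH = b * L - g * H
        dY = H - d * Y
        L, H, Y = L + dL * dt, H + dH * dt, Y + dY * dt
        series.append((L, H))
    return series
```

As heavy use accumulates, the memory term throttles initiation, which is the mechanism that makes prevention more leveraged early, before Y has built up.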

Caulkins, Jonathan
Working Paper, 2002
Evaluating Local Studies of Barriers to Fair Housing

 

Johnson, Michael
Working Paper, 2002
Guns and Youth Violence: An Examination of Crime Guns in One City

Firearms are an important factor in violent crimes. Nationally, the percentage of violent offenses that involve use of a firearm closely tracks changes in the supply of newly manufactured pistols (Figure 1). As more pistols became available their use in violent crimes increased. After 1985, firearms were especially implicated in the dramatic rise in juvenile homicide rates, both as victims (Fingerhut, 1993; Fingerhut, et al., 1998) and offenders (Blumstein, 1995). While juvenile rates of homicides by gun surged upward, both adult and nongun juvenile homicide rates remained relatively flat during the same period (Blumstein and Cork, 1996; Cork, 1996). While the link between guns and youth homicides is compelling in aggregate data, very little is known about how gun availability actually affects individual behavior among youth, whether that effect differs between young adults and juveniles, and whether that relationship has changed over time. The research discussed here examines spatial and temporal features of crime guns in one city. The analysis focuses on attributes of crime guns and those who possess them, the geographic sources of those guns, the distribution of crime guns over neighborhoods in a city, and the relationship between the prevalence of crime guns and incidence of violent crimes especially homicides.

Cohen, Jacqueline
Gorr, Wilpen
Working Paper, 2002
Household Demand for Employer-Based Health Insurance


Gaynor, Martin
Vogt, William
Working Paper, 2002
Impact of Police Raids at Nuisance Bars on Illegal Drug Dealing: Estimating Intervention Effects in Varying Risk Settings

This paper examines the effects of police raids at nuisance bars on drug dealing in and around the nuisance bar. We examine effects of both dosage (number of raids) and duration of the intervention, as well as the conditioning effects of land use and population characteristics in shaping the underlying risk levels of drug dealing in the target and surrounding displacement areas. Results indicate that the police intervention does suppress levels of drug dealing during periods of active enforcement, but these effects largely disappear when the intervention is withdrawn. Also, the effects of the intervention are mediated by risk characteristics in target and displacement areas. In general, target areas characterized by higher levels of risk are more resistant to intervention effects than those with lower levels of risk. Risk factors in nearby displacement areas are also significant. Bars with high levels of risk arising from land uses in surrounding areas are easier to treat, while bars with high levels of population-based risk in surrounding displacement areas are harder to treat.

Cohen, Jacqueline
Gorr, Wilpen
Working Paper, 2002
Innovative Solutions in Providing Access to Micro-data

State of the art: An overview of policy and practice on release of microdata

Statistical offices must provide data products that are both useful and have low risk of confidentiality disclosure. Recognizing that deidentification of data is generally inadequate to protect their confidentiality against attack by a data snooper, agencies can release microdata under policies of restricted access or can release products of restricted data. Under a policy and practice of restricted access, administrative procedures impose conditions on user access to data. These conditions may depend on the type of data user; conditions may be different for interagency data sharing than for external data users. Various restricted access policies (Jabine 1993a,b) have been implemented in the last twenty years.

Duncan, George
Presentations and Proceedings, 2002
Mediating the Tension Between Information Privacy and Information Access: The Role of Digital Government

Government agencies collect and disseminate data that bear on the most important issues of public interest. Advances in information technology, particularly the Internet, have multiplied the tension between demands for evermore comprehensive databases and demands for the shelter of privacy. In mediating between these two conflicting demands, agencies must address a host of difficult problems. These include providing access to information while protecting confidentiality, coping with health information databases, and ensuring consistency with international standards. The policies of agencies are determined by what is right for them to do, what works for them, and what they are required to do by law. They must interpret and respect the ethical imperatives of democratic accountability, constitutional empowerment, and individual autonomy. They must keep pace with technological developments by developing effective measures for making information available to a broad range of users. They must both abide by the mandates of legislation and participate in the process of developing new legislation that is responsive to changes that affect their domain. In managing confidentiality and data access functions, agencies have two basic tools: techniques for disclosure limitation through restricted data and administrative procedures through restricted access. The technical procedures for disclosure limitation involve a range of mathematical and statistical tools. The administrative procedures can be implemented through a variety of institutional mechanisms, ranging from privacy advocates, through internal privacy review boards, to a data and access protection commission.

Duncan, George
Roehrig, Stephen
Working Paper, 2002
Optimizing Caching in Object-Oriented Applications


Working Paper, 2002
Site Selection for Location of Community Corrections Centers

Community corrections centers (CCCs) are known to be an effective criminal justice strategy. However, location of CCCs is challenging: residents of potential destination communities often regard them as "undesirable" land uses. This paper develops a methodology by which a small set of feasible CCC locations may be identified from a larger initial set using data on neighborhood characteristics and currently operating CCCs. This methodology is based on a multi-stakeholder deliberative process and spatial analysis. Results from this process may be used as inputs to models that identify CCC sites that optimize various policy criteria. A portion of this methodology is applied to data from Pittsburgh, PA to demonstrate that the process is reasonable in terms of data requirements, stakeholder preference elicitation and outcomes generation. Preliminary results indicate that the set of potential CCC sites selected is more diverse than one that might be identified using real estate or community corrections expertise alone.

 

Johnson, Michael
Working Paper, 2002
Statistical Data Stewardship in the 21st Century: An Academic Perspective

This paper presents an academic perspective on a broad spectrum of ideas and best practices for statistical data collectors to ensure proper stewardship for personal information that they collect, process and disseminate. Academic researchers in confidentiality address statistical data stewardship both because of its inherent importance to society and because the mathematical and statistical problems that arise challenge their creativity and capability. To provide a factual basis for policy decisions, an information organization (IO) engages in a two-stage process: (1) It gathers sensitive personal and proprietary data of value for analysis from respondents who depend on the IO for confidentiality protection. (2) From these data, it develops and disseminates data products that are both useful and have low risk of confidentiality disclosure. The IO is a broker between the respondent who has a primary concern for confidentiality protection and the data user who has a primary concern for the utility of the data. This inherent tension is difficult to resolve because deidentification of the data is generally inadequate to protect their confidentiality against attack by a data snooper. Effective stewardship of statistical data requires restricted access or restricted data procedures. In developing restricted data, IOs apply disclosure limitation techniques to the original data. Desirably, the resulting restricted data have both high data utility U to users (analytically valid data) and low disclosure risk R (safe data). This paper explores the promise of the R-U confidentiality map, a chart that traces the impact on R and U of changes in the parameters of a disclosure limitation procedure. Theory for the R-U confidentiality map is developed for additive noise. By an implementation through simulation methods, an IO can develop an empirical R-U confidentiality map. 
Disclosure limitation for tabular data is discussed and a new method, called cyclic perturbation, is introduced. The challenges posed by on-line access are explored.
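A toy version of such an empirical R-U map can be traced by simulation. In the sketch below both measures are deliberately simple stand-ins chosen for illustration, not the paper's formal definitions: disclosure risk R is operationalized as re-identification by nearest match, and data utility U as variance preservation.

```python
import random
import statistics

def ru_map(lams, n=200, seed=1):
    """Trace an empirical R-U map for additive Gaussian noise with
    variance lam: mask the data, then score disclosure risk R and
    data utility U at each noise level."""
    rng = random.Random(seed)
    true = [rng.gauss(0, 1) for _ in range(n)]
    points = []
    for lam in lams:
        masked = [x + rng.gauss(0, lam ** 0.5) for x in true]
        # Risk R: fraction of masked records whose nearest true value
        # is their own source record (a snooper's matching attack).
        hits = sum(
            min(range(n), key=lambda j: abs(true[j] - m)) == i
            for i, m in enumerate(masked)
        )
        # Utility U: how well the masked data preserve the variance.
        distortion = abs(statistics.variance(masked) - statistics.variance(true))
        points.append((lam, hits / n, 1 / (1 + distortion)))
    return points
```

Larger noise variance moves down the map: risk falls, but so does utility, which is exactly the trade-off the R-U chart is designed to expose.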

Duncan, George
Working Paper, 2002
The Dynamic Character of Drug Problems

This paper makes three points. (1) Drug-related measures, such as the number of users, have changed rapidly over time, suggesting that they are not merely symptoms of underlying trends in the economy, demographics, or other aggregates that change more slowly. (2) Drug markets are subject to a wide range of feedback effects that can induce non-linearity into dynamic behavior. (3) There are at least five classes of epidemic models that reflect such non-linear dynamic behavior. Some of those classes tend to be optimistic about the ability of drug control interventions to reduce use; others are pessimistic. It is hoped that this discussion and, in particular, the typology, can inform and elevate the debate about drug policy, but it is unlikely to resolve that debate because of the inability to demonstrate empirically which class(es) are most accurate.

Caulkins, Jonathan
Working Paper, 2002
The Impact of Shopbots on Electronic Markets

Internet shopbots are automated tools that allow customers to easily search for prices and product characteristics from online retailers. Some market observers have predicted that shopbots will benefit consumers at the expense of retailers. In this view, shopbots will radically reduce consumer search costs, reduce retailer opportunities to differentiate their products, and as a result will drive retailer margins toward zero. However, a review of the literature suggests that, while shopbots may place pressure on retailer margins in some circumstances, retailers retain numerous opportunities to differentiate their products, leverage brand names, set strategic prices, and reduce the effectiveness of consumer search at shopbots. The paper closes by identifying significant questions for future research.
 

Smith, Michael
Working Paper, 2002
The Law of One Price? The Impact of IT-Enabled Markets on Consumer Search and Retailer Pricing

Recent IT research has analyzed how the performance of IT-enabled markets may differ from conventional markets. This literature has made two unexpected empirical findings. First, IT-enabled markets for commodity goods exhibit significant price dispersion. Second, well-known retailers in these markets appear to cooperate to set high prices. This paper presents an analytic model, and confirmatory empirical evidence, that explains this behavior as a response to the unique characteristics of consumer search in electronic markets. In conventional markets, consumer search costs are primarily a function of the consumer’s physical proximity to retailer outlets - and physical proximity is distributed relatively equally across retailers. In electronic markets, consumer search costs are primarily a function of the consumer’s mental awareness of different retailers - and this awareness is likely to be concentrated in the hands of a few retailers. Based on this model of consumer search, IT-enabled markets for commodity goods exhibit high price dispersion in equilibrium and a few well-known retailers are able to cooperate to set high prices. The predictions of the model are shown to be consistent with empirical data for 23,744 books collected from 24 Internet retailers in late 1999. Viewing consumer search in this manner provides a useful starting point for understanding the likely development of IT-enabled markets, and for understanding the importance of advertising and first-mover advantage for electronic market participants.

Smith, Michael
Working Paper, 2002
To What Degree Does Food Assistance Help Poor Households Acquire Enough Food?

This paper studies the efficacy of public and private food assistance in alleviating food shortages among poor households by considering the effects of all major forms of domestic food assistance: the Food Stamp Program, WIC, and food pantries. The analyses are based on detailed data that were collected in 1993 from 398 low-income households in Allegheny County, Pennsylvania. This research adds to the knowledge base on the efficacy of public and private food assistance in alleviating food shortages among poor households by jointly considering the effects of both public and private forms of food assistance. After reconsidering standard food consumption models, the analysis modifies these models to account for misspecification and extends these models to include the effects of both public and private food assistance. Then, the paper examines the effect each of the widely available forms of food assistance has on helping poor households acquire enough resources to potentially meet basic nutritional requirements. Research findings suggest that compared with other forms of food assistance, the receipt of a significant amount in food stamps has a much greater impact on whether a household attains at least the Thrifty Food Plan than the receipt of food from a food pantry or through the WIC program.
 

Daponte, Beth
Haviland, Amelia
Working Paper, 2002
Using Models that Incorporate Uncertainty

What does a practicing policy analyst need to know about using models that incorporate uncertainty? One could write volumes focusing only on models that produce specific numeric forecasts (as opposed to conceptual models that address uncertainty qualitatively). But in the 21st century sound bites are more useful. In short: The world is uncertain; but sampling variability isn’t the driver; and so there is no excuse not to simulate. The real challenge, though, is to communicate effectively about models that incorporate uncertainty.
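In that spirit, here is a minimal Monte Carlo sketch of a numeric forecast under uncertainty. All quantities below (a caseload forecast, an uncertain unit cost) are hypothetical, chosen only to show how little code a simulation-based interval forecast requires.

```python
import math
import random
import statistics

def forecast_cost(n_draws=10_000, seed=7):
    """Simulate total program cost = caseload x unit cost when both
    inputs are uncertain; return the mean and a 90% interval."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n_draws):
        caseload = rng.lognormvariate(math.log(1000), 0.2)  # ~1,000 cases
        unit_cost = rng.gauss(250.0, 40.0)                  # dollars per case
        draws.append(caseload * unit_cost)
    draws.sort()
    return (statistics.mean(draws),
            draws[int(0.05 * n_draws)],   # 5th percentile
            draws[int(0.95 * n_draws)])   # 95th percentile
```

Reporting the interval alongside the point estimate is the communication step the passage identifies as the real challenge.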

Working Paper, 2002
Consumer Decision-making at an Internet Shopbot: Brand Still Matters

Over the coming century, computer technology is likely to become capable of reproducing many of the skills now performed by human labor. This paper describes three models of the aggregate economic changes that occur when capital becomes capable of performing human work skills. The basic model, with a single sector and homogeneous labor, projects output growth rates over the next few decades that are substantially above historical growth rates in industrialized countries, assuming plausible increases in computer skill. The projected output growth is accompanied by structural changes reflecting the reduced role of labor, with wage growth lagging output growth and the labor share of output decreasing. Resource limits do not substantially affect the levels of output and wage growth in the near future. The 2-type model, with fixed skill differences between different workers, produces similar growth in output and average wages over the next several decades. However, the worker skill differences produce large increases in wage inequality between types of workers. The 2-sector model, with different skill requirements for different economic sectors, also produces similar growth in output and wages over the next several decades. For the three models, asymptotic growth in output and wages is substantially reduced by resource limits, worker skill differences, and sector skill differences, even though those constraints do not substantially reduce growth over the next few decades. The models produce patterns of change in the labor share and capital-output ratio that are consistent with broad trends in economic data.
 

Smith, Michael
Working Paper, 2001
Corporate Restructuring and R&D: A Panel Data Analysis for the Chemical Industry

This paper contributes a novel approach to the existing literature on the effects of restructuring on R&D investment by focusing on a single industry, chemicals. The chemical industry is very research intensive and has experienced thorough restructuring since the early 1980s. By focusing on a single industry we are able to identify the technological and R&D features of its segments. This is important, since there is evidence that restructuring affects R&D differently in businesses with different technological features. However, no study so far has provided a systematic inquiry into this link. Using a panel of 535 European, American, and Japanese firms for the years 1987-1997, we find restructuring to be an important component in the observed changes in R&D intensity. This paper shows that restructuring affects R&D both through changes in size and through changes in the composition of business portfolios, and that these effects differ across industry segments.
 

Arora, Ashish
Working Paper, 2001
Facilitating Negotiations between Stakeholders in Subsidized Housing Planning


Working Paper, 2001
In the footsteps of the Silicon Valley? Indian and Irish software in the international division of labour

This paper analyzes the development of software in India and Ireland. The development patterns of the software industry in Ireland and India clearly show both the advantages and disadvantages of being a follower. The most obvious advantage is the ability to sustain growth without a broad based set of technical capabilities, at least initially. With the leaders creating and defining markets, and possibly even the business models, and the policy and technical infrastructure required, many uncertainties are greatly reduced. Moreover, in many instances, multinationals from the leading countries can catalyse growth and may even, as in Ireland, account for a substantial part of the initial growth. On the other hand, relatively narrow sources of competitive advantage imply that the firms in the follower industries tend to be similar in capabilities, with competition among them transferring the bulk of the benefits to customers overseas. Sophisticated and well established competitors located in the leading clusters stand in the way of followers moving up the value chain, leaving innovative firms to search for new niches, and ways to link to lead users. Moreover, clusters in the follower countries lack the thick vertical and horizontal links that are important for knowledge spillovers, innovation and growth. However, this analysis, which draws on the evidence collected in India and Ireland through two surveys of domestic firms and foreign-owned firms, also suggests that early success, narrowly based though it may be, can lay a foundation for future growth that is based more on innovation.

 

Arora, Ashish
Working Paper, 2001
Pieces of the Action: Ownership and the Changing Employment Relationship

This essay develops and links two models of ownership: first a composition model of the dimensions comprising ownership in firms, and a content model specifying the societal, firm, and individual factors that give rise to workers’ motivation to participate in ownership and employers’ motivation to share ownership. Ownership comprises financial participation, including control over residual assets, access to marginal revenues, participation in decision making, and access to financial information; along with sociopsychological factors including social standing, social responsibility, and psychological ownership. Firm ownership, across financial and sociopsychological facets, is increasingly parcelled out among financial investors, managers and workers. This new distribution of ownership is particularly characteristic of high technology and start up firms, due to the mobility of highly skilled workers, and their consequent power in the employment relationship. We specify how societal factors, firm characteristics, and worker qualities impact the motivation to own and the motivation to share ownership. By focusing on the shifting power balance of highly mobile workers, this treatment of emerging ownership practices provides a theoretical basis for understanding the employment relationship in start ups and high technological firms.

Rousseau, Denise
Working Paper, 2001
Point Demand Forecasting

This paper provides geographic information system (GIS) methods and empirical models to forecast point demand for home-delivered goods. A point forecast consists of stops on a street network, including demand at each stop. The purpose of the forecast is to support a network optimization model, based on the traveling salesman problem, to locate one or more new facilities in a region. This paper illustrates the approach with a case study of home-delivered meals (meals on wheels) in Allegheny County, Pennsylvania.
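The optimization side can be sketched with a greedy stand-in: the nearest-neighbor heuristic below builds a delivery tour over forecast stops. It is only an illustration of the kind of traveling-salesman computation a point forecast feeds; the heuristic and the coordinates are not the paper's model.

```python
import math

def nearest_neighbor_tour(depot, stops):
    """Build a delivery tour by repeatedly visiting the closest
    remaining stop, then returning to the depot; returns the tour
    and its total Euclidean length."""
    tour, remaining, here = [depot], list(stops), depot
    length = 0.0
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(here, p))
        length += math.dist(here, nxt)
        remaining.remove(nxt)
        tour.append(nxt)
        here = nxt
    length += math.dist(here, depot)  # return leg to the facility
    tour.append(depot)
    return tour, length
```

The heuristic is only an approximation in general; in a facility-location setting one would evaluate candidate depots by the tour lengths they induce over the forecast demand points.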
 

Gorr, Wilpen
Johnson, Michael
Roehrig, Stephen
Working Paper, 2001
A Typology for Network Measures for Organizations

Numerous measures of organizational structure have been developed. The goal is to develop a small, meaningful, and predictive set. Work in this area, however, has been hampered by the lack of a standard categorization schema. Such a schema is presented herein. This schema is based on the recognition that many aspects of organizational structures can be represented as graphs.
 

Carley, Kathleen
Krackhardt, David
Working Paper, 2000
Adaptive Organizations and Emergent Forms

Over time organizations change and coordinate personnel in new ways. Such changes may be precipitated by actual or anticipated changes in personnel, the environment, technologies, legislation, or the top management team. This adaptation is constrained and not all forms of coordination are feasible. Since organizations are inherently computational entities, insight is gained by examining the adaptation of organizations using intelligent artificial agents. Using ORGAHEAD, a multi-agent model of organizational behavior, a series of virtual experiments were run to examine issues of organizational adaptation. Results suggest the concurrent occurrence of experiential learning and structural learning generates within the organization the ability to learn meta-change strategies which can be either adaptive or maladaptive. Such meta-change strategies effectively lock organizations into divergent paths of behavior which produce heterogeneity of form across the population of organizations. Organizational performance and form depend on a complex array of factors including environmental change, experiential and structural learning, and the emergence of institutionalized strategies.
 

Carley, Kathleen
Working Paper, 2000
Assessment of Crime Forecasting Accuracy for Deployment of Police

Crime forecasting is a new area of research, following upon the success of crime mapping for support of tactical deployment of police resources. The major question investigated in this paper is whether it is possible to accurately forecast crime one month ahead at a “small-scale” aggregation, i.e., at the precinct level. In a case study of Pittsburgh, Pennsylvania, we contrast the forecast accuracy of standard, univariate time series models with non-modeling practices commonly used by police. Included is a comparison of seasonality estimates made by precinct versus the city as a whole. As suspected for the small-scale data of this problem, average crime count by precinct and crime type is the major determinant of forecast accuracy. A fixed effects regression model of absolute percent forecast errors shows that such counts need to be on the order of 30 or more to achieve accuracy of 20 percent error or less. A second major result is that practically any model-based forecasting approach is vastly more accurate than current police practices. Thirdly, this is the first empirical paper to investigate crime seasonality at the sub-city level. Our seasonality estimates provide evidence supporting the routine activities theory of crime, but not earlier theories.
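The count threshold reported above has a simple statistical rationale: if monthly counts are roughly Poisson, their standard deviation is the square root of the mean, so even a perfectly calibrated forecast faces a percent error on the order of 1/√mean, and 1/√30 ≈ 18 percent. The sketch below illustrates this on synthetic counts (a normal approximation to Poisson; all numbers are hypothetical, not the paper's data).

```python
import math
import random

def mape(actual, forecast):
    """Mean absolute percent error."""
    return 100 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)

def error_at_mean(mean_count, months=240, seed=3):
    """MAPE of a perfectly calibrated forecast (always predicting the
    true mean) on synthetic monthly crime counts."""
    rng = random.Random(seed)
    counts = [max(1, round(rng.gauss(mean_count, math.sqrt(mean_count))))
              for _ in range(months)]
    return mape(counts, [mean_count] * months)
```

Even with the forecast exactly right on average, sparser series carry larger percent errors, which is why low-count precinct series resist accurate forecasting.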
 

Gorr, Wilpen
Working Paper, 2000
Clinical Reminder System: A Relational Database Application for Evidence-Based Medicine Practice

Evidence-based medicine is the distillation of a large volume of medical research and standards into treatment protocols for diseases and preventive care procedures that represent the most accurate knowledge available. In this project, we implement evidence-based medicine principles via a decision support system that provides suggested actions for physicians based on individual patient characteristics and established treatment protocols. Such a reminder system may enable physicians to make better-quality decisions, and may help patients follow medical recommendations more consistently. This paper presents a prototype DSS, called Clinical Reminder System, that combines a relational database, a knowledge base consisting of algorithms that implement disease treatment protocols, integration with hospital legacy systems and a web-based interface allowing for physician management of patient data and suggested medical responses. This application has been in use within a clinical setting since 2001. Formal evaluation and assessment of patient outcomes associated with use of this system is currently being performed by Carnegie Mellon University and The Western Pennsylvania Hospital.

 

Johnson, Michael
Padman, Rema
Working Paper, 2000
Distinguishing Between Effects of Criminality and Drug Use on Violent Offending

The alarming increase in lethal violence among young people in the U.S.-which is often attributed to drug use and drug trafficking-has prompted re-examination of the relationship between drugs and violent offending. While no national data exist, numerous local studies find a high prevalence of homicide deaths among identified drug addicts, a high prevalence of substance use-typically alcohol-among victims of homicide, and a high proportion of persons testing positive for drug use among arrestees for violent offenses. Other studies report large increases in drug-related homicides or other violence associated with drug distribution. In a departure from previous research that contrasts users and nonusers of drugs, or compares broad periods of heavy and light drug use during long addiction careers, the present study attempts to isolate more direct effects of drug use near the time of offending. The data are for a sample of adults arrested in Washington, DC from July 1, 1985 to June 30, 1986, and include their longitudinal arrest histories along with the results of urine drug screens administered following arrest.

Caulkins, Jonathan
Cohen, Jacqueline
Working Paper, 2000
Do You Go to College If Your Parents Want You To?

 

McElroy, Susan
Working Paper, 2000
Facility Location Model for Home-Delivered Services: Application to the Meals-on-Wheels Program

This paper presents a GIS-based decision support system for the non-profit sector, designed to assist strategic and tactical decision making in the area of home-delivered services such as meals on wheels. Using data collected from existing programs, current and forecasted demographic data, and a series of algorithmic tools, this paper provides a system for evaluating current meals on wheels facilities, and for making facility location decisions that satisfy coverage and equity requirements.

Gorr, Wilpen
Johnson, Michael
Roehrig, Stephen
Working Paper, 2000
Frictionless Commerce? A Comparison of Internet and Conventional Retailers

There have been many claims that the Internet represents a new nearly "frictionless market." Our research empirically analyzes the characteristics of the Internet as a channel for two categories of homogeneous products-books and CDs. Using a data set of over 8,500 price observations collected over a period of 15 months, we compare pricing behavior at 41 Internet and conventional retail outlets. It is found that prices on the Internet are 9-16% lower than prices in conventional outlets, depending on whether taxes, shipping, and shopping costs are included in the price. Additionally, it is found that Internet retailers’ price adjustments over time are up to 100 times smaller than conventional retailers’ price adjustments-presumably reflecting lower menu costs in Internet channels. It is also found that levels of price dispersion depend importantly on the measures employed. When comparing the prices posted by different Internet retailers, substantial dispersion is found. Internet retailer prices differ by an average of 33% for books and 25% for CDs. However, when these prices are weighted by proxies for market share, it is found that dispersion is lower in Internet channels than in conventional channels, reflecting the dominance of certain heavily branded retailers.



Smith, Michael
Working Paper, 2000
Intellectual Property Strategies and the Returns to R&D

Although the prospect of obtaining patent protection is believed to encourage R&D investments and thus the rate of inventive activity, there is little by way of direct evidence to support this belief. This paper uses original data from the 1994 Carnegie Mellon survey on the appropriation of R&D in the US manufacturing sector to empirically estimate a structural model linking a firm’s choice of the optimal level of R&D efforts with its intellectual property protection strategy. The use and effectiveness of different technological strategies, including patenting, secrecy and the exploitation of first-mover advantages, in conditioning the effect of firm and industry characteristics such as firm size and competitive pressure on the returns to firms’ inventive activity is explicitly modeled. The analysis also incorporates the role of information spillovers and other organizational factors influencing the productivity of R&D investments. A key result is that the effectiveness of a firm’s patenting strategy is one of the main determinants of R&D efforts and thus the production of inventions in only selected industries.
 

Arora, Ashish
Working Paper, 2000
Intra-Organizational Computation and Complexity

Organizations are complex systems. They are also information processing systems comprised of a large number of agents such as human beings. Combining these perspectives and recognizing the essential non-linear dynamics that are at work leads to the standard non-linear multi-agent system conclusions such as: history matters, organizational behavior and form are path dependent, complex behavior emerges from individual interaction, and change is inevitable. Such a view, while descriptive, is still far from the level of specificity and predictive richness that is necessary for organizational theory. To increase the specificity and value of our theories we will need to take into account more of the actual attributes of tasks, resources, knowledge and human cognition. In doing so, it will be possible to achieve a more adequate description of organizations as complex computational systems. More importantly, we will also achieve a greater ability to theorize about the complexity of organizational behavior. This paper describes complexity theory and computational organization theory. Then a description of organizations as complex computational systems is presented and operationalized as a computational model. Within this perspective, organizational behavior results from the actions of heterogeneous actors, the boundaries between agents, tasks, and resources are permeable, organizational roles emerge, organizational groups are networks, and information technology plays a key role as an interactive agent.
 

Carley, Kathleen
Working Paper2000
Markets for Technology and Their Implications for Corporate Strategy

Although market transactions for technologies, ideas, knowledge or information are limited by several well known imperfections, there is increasing evidence that they have become more common than in the past. In this paper we argue that these markets change the traditional mindset in which the only available option for a company wishing to introduce an innovation is to develop the technology in-house, or for a company developing the technology to own the downstream assets needed to manufacture and commercialize the goods. This affects the role of companies both as technology users (they can "buy" technologies) and as technology suppliers (they can "sell" technologies). The implications for management include more proactive management of intellectual property, greater attention to external monitoring of technologies, and organizational changes to support technology licensing, joint-ventures and acquisition of external technology. For entrepreneurial startups, markets for technology make a focused business model more attractive. At the industry level, markets for technology may lower barriers to entry and increase competition, with obvious implications for the firms’ broader strategy as well.
 

Arora, Ashish
Working Paper2000
PCANS Model of Structure in Organizations

This paper presents a network-based approach to characterizing organizational architectures in terms of three domain elements: individuals, tasks, and resources. Characterizing the possible relations among these elements yields five relational primitives: Precedence, Commitment of resources, Assignment of individuals to tasks, Networks (of relations among personnel), and Skills linking individuals to resources. The utility of this model for re-characterizing classical organizational theory and for generating a series of testable hypotheses about organizational performance is demonstrated.
 

Carley, Kathleen
Krackhardt, David
Working Paper2000
Property Rights, Firm Boundaries, and R&D Inputs

This paper provides an explanation of the role of intellectual property rights (IPRs) in information-intensive vertical supply relationships. Specifically, the connection between stronger property rights and the enhanced viability of specialized (versus vertically integrated) input suppliers under incomplete contracts and information spillovers is explored. Information spillovers arise due to the supplier’s effort to customize its generalized technology to the specific needs of the buyer. This paper starts by modeling a tradeoff between incentives and two types of information spillovers: "synergies," in which joint efforts reveal new applications of existing technology; and "leakage," or disclosure of existing information. Whereas incentives for customization are higher under specialization, integration internalizes spillovers and prevents rent dissipation. IPRs favor specialization by reducing buyer opportunism, and ceteris paribus, leakage favors integration relative to synergies. The basic results are extended to an analysis of buyouts and spinoffs, and an extensive body of empirical evidence is assayed that provides broad support for the approach used here.


 

Arora, Ashish
Working Paper2000
Reforming the Eighth-Grade Student Assignment Process for the Philadelphia Public Schools

The eighth-grade student assignment project is an initiative of the School District of Philadelphia that assigns students to high school academic programs based on student preferences, academic preparation, program capacity and desegregation requirements. This paper describes recent modifications to the eighth-grade student assignment process that resulted in: a comprehensive, realistic description of business processes, a new, more accurate method for recording family preferences and modifications to the student assignment process that better reflects student preferences and system constraints. This paper describes other recommendations not yet implemented, including more flexible program reporting to families, a relational database to meet future student processing needs, customer service via the Internet and management science models to automate the assignment process.

 

Johnson, Michael
Working Paper2000
Should Professional Boxing Change Its Scoring System? A Comparison of Current and Proposed Methods

In the aftermath of the controversial draw verdict in the first bout between Evander Holyfield and Lennox Lewis, numerous suggestions have been advanced to reform the process of scoring professional boxing contests. This paper compares the status quo scoring system in boxing, the 10-Point Must System, to New Jersey’s 10-Point Majority System and the "consensus scoring" technique currently being considered by the New York State Legislature. The three scoring systems are compared on theoretical grounds and empirically, by applying each system to every world title fight sanctioned by the sport’s four major sanctioning bodies between 1986 and 1999 that was decided on the judges’ scorecards.

Algranati, David
Working Paper2000
SMMM: A Metric Based Framework to Evaluate the Scalability of Multiple Model Methodologies

Multiple Model Methodologies (MMM) have become ubiquitous in the area of conceptual modeling. For example, the Unified Modeling Language (UML) is a popular MMM in software engineering, and MMMs are also used in enterprise modeling. Over the last few years, the size of the problem domains being modeled by MMMs has also grown: software is now bigger, and enterprises are significantly larger in scale than the problem domains modeled when building legacy systems. These trends in the area of conceptual modeling raise an important question about the scalability of MMMs as they are applied to domains of increasing size. In this work, we present a comprehensive, metric-based framework called SMMM (Scalability of MMM). SMMM assumes that the only obstacle to the scalability of an MMM is the complexity that users face when using the MMM to create models of the underlying reality, as this reality grows in size. SMMM extends previous work in the area of complexity measurement in the following ways. First, SMMM is comprehensive, and yet parsimonious. Second, metrics in earlier works have made explicit assumptions about the relative cognitive difficulties of modeling different categories of concepts; SMMM makes no assumptions about any concept being harder to model than another. Third, previous work on metric development has omitted the role of empirical work in understanding complexity. The SMMM framework explicitly recognizes the role of empirical work in evaluating cognitive difficulty. Fourth, SMMM measures both intra-model and inter-model complexity. Intra-model complexity values show which models in the MMM are the source of complexity. Inter-model complexity values measure the complexity arising from the interaction between different models in the MMM.
 

Bajaj, Akhilesh
Working Paper2000
Statistical Methods to Evaluate Geographically-Targeted Economic Development Programs

In recent years an increasing amount of effort has been devoted to the evaluation of geographically-targeted economic development (GTED) programs. In the U.S. and in Great Britain, geographically-targeted business incentives (known as Enterprise Zone programs) are an important policy instrument to revitalize local communities. Within the E.U., interest in the evaluation of GTED programs is fueled by the number of development programs cofunded by the European Regional Development Fund, the European Social Fund, and the European Agricultural Guidance and Guarantee Fund. The surging interest in the evaluation of GTED programs is challenged by the difficulty of assessing the causal link between the program intervention and the observed changes in the economic outcomes of interest. Evaluating GTED programs is a difficult task because it requires the evaluator to distinguish changes due to the program from changes due to the many factors independent of the program intervention. The task is made more difficult by the lack of experimental data available to the evaluator. This paper illustrates the sources of the potential biases that can affect impact estimates of GTED programs and develops a number of statistical methods that control for those sources. The proposed methods are then organized in a decision-tree algorithm that provides guidance for selecting the most appropriate methodology for the analysis, based on the program characteristics and on the type of data available. An evaluation of the impact of the U.S. Enterprise Zones on local employment concludes the paper as an empirical application of the methods and the decision-tree algorithm. This application highlights how seriously distorted impact estimates can be when they are obtained using unsophisticated analytical tools; the methods proposed in the paper prove effective at avoiding these distortions.
 

Working Paper2000
Trust, Risk and Electronic Commerce: Nineteenth Century Lessons for the 21st Century

Trust, Risk and Electronic Commerce: Nineteenth Century Lessons for the 21st Century
 

Working Paper2000
Understanding Digital Markets: Review and Assessment

As the Internet develops into a robust channel for commerce, it will be important to understand the characteristics of electronic markets. Businesses, consumers, government regulators, and academic researchers face a variety of questions when analyzing these nascent markets. Will electronic markets have less friction than comparable conventional markets? What factors lead to dispersion in Internet prices? What are the major electronic commerce developments to watch in the coming years? This paper addresses these questions by reviewing current academic research, discussing the implications of this research, and proposing areas for future study. The paper reviews evidence that Internet markets are more efficient than conventional markets with respect to price levels, menu costs, and price elasticity. However, several studies find substantial and persistent dispersion in prices on the Internet. This price dispersion may be explained, in part, by heterogeneity in retailer-specific factors such as trust and awareness. In addition, the paper notes that Internet markets are still in an early stage of development and may change dramatically in the coming years with the development of cross-channel sales strategies, infomediaries and shopbots, improved supply chain management, and new information markets.
 

Smith, Michael
Working Paper2000
A Plan Induction System for Monitoring and Interpreting Operator Interventions in Process Control Environments

This paper describes the architecture and behavior of a prototype intelligent decision support system for monitoring operations in complex process control environments. Development of the underlying model required an examination of the various influences on process outcomes, including not only the causal nature of physical processes themselves, but also the role of human interventions and the associated impact of operating procedures on human behavior. The empirical study of nuclear power plant operations used in this research indicates that procedures are an important, but not necessarily deterministic, influence on the intervening behavior of an operator. Operators will deviate from procedures when the requirements of a situation render a procedure inadequate or counterproductive. Goal- and plan-based knowledge structures were derived from physical processes, operating procedures, and human operators. These structures were incorporated into the model's knowledge base, which serves as the basis for interpretation and prediction of operator interventions in a series of emergency scenarios in simulated real-time. The eventual goal of this research is to enhance management oversight and control of complex, dynamic task environments by providing both management and operators with advice that is informed by an understanding of the constituent influences on process outcomes.

Peters, James
Working Paper1999
A Prototype Decision Aid for Internal Control Testing Plan Selection

The research reported here takes a preliminary step to providing auditors with decision support for making assertion-level control risk assessments by developing a prototype decision aid that helps auditors select an optimal control testing plan designed to achieve target assertion-level control risk assessments. The aid supports the control testing plan selection decision by modeling accounting information system components based on their impacts on financial statements assertions, providing an evidence combination algorithm based on reliability theory, and identifying all optimal control testing plans. The aid was validated by comparing its testing plans to both experienced auditors and a professional benchmark. The results indicate that the aid’s testing plans test sufficient controls to provide auditors with their desired assurance but do so by testing fewer controls than either experienced auditors or a professional benchmark.
 

Peters, James
Working Paper1999
Do Green Businesses Benefit Communities?

Do Green Businesses Benefit Communities?
 

Davison, Derek
Florida, Richard
Working Paper1999
Engine or Infrastructure? The University Role in Economic Development

Engine or Infrastructure? The University Role in Economic Development
 

Florida, Richard
Working Paper1999
Enterprise Zones and Local Employment: Evidence From the States' Programs

Many states respond to deteriorating economic conditions in their inner cities and rural communities by establishing geographically targeted tax incentives. This paper examines the impact of several of these Enterprise Zone (EZ) programs on local employment. The results show that the EZ programs do not have a significant impact on local employment. Program impact does not depend on the monetary amount of the incentives or on specific features of program design. These conclusions are consistent across two econometric approaches to controlling for the non-random placement of zones and stand up to a wide variety of sensitivity analyses.

Engberg, John
Working Paper1999
Forecasting Crime

Organizations in the private sector must do strategic planning over long-term horizons to locate new facilities, plan new products, develop competitive advantages, and so forth. Consequently, long-term forecasts of demand, costs of raw materials, and similar quantities are important in the private sector. There is no such strategic counterpart to police work; consequently, long-term forecasts are of little value to police. Police primarily need short-term forecasts; for example, crime levels one week or one month ahead. Currently, police mostly respond to new crime patterns as they occur. Client-server computing for real-time access to police records and computerized crime mapping have made it possible for police to keep abreast of crime. With short-term forecasting, police may be able to get one step ahead of criminals by anticipating and preventing crime. This paper proceeds first with a description of short-term forecasting models, to establish basic terms and concepts. Next is a discussion of the unique features of crime space-time series data and the need for data pooling to handle small-area model estimation problems. Last is a discussion of the particular forecasting requirements of police, followed by a summary.

Gorr, Wilpen
Szczypula, Janusz
Working Paper1999
How Large Should the Strike Zone Be in "Three Strikes and You're Out" Sentencing Laws?

So-called "three strikes and you’re out" sentencing laws for criminal offenders have proliferated in the United States in the 1990s. The laws vary considerably in their definitions of what constitutes a "strike". This paper adapts the classic Poisson Process model of criminal offending to investigate how varying sentence lengths and definitions of what constitutes a strike affect the effectiveness and cost-effectiveness of these sentencing laws. In particular, it asks whether by using different definitions for the first, second, and third strikes or different sentence lengths, one can make the resulting incarceration more "efficient" in the sense of incapacitating more crimes per cell-year served.
 

Caulkins, Jonathan
Working Paper1999
Labor Market Discrimination

Two central and related questions in economics concern how resources are distributed and why some persons earn more than others. In the labor market, where the buyers are employers and the sellers are workers, exchange occurs when workers "sell" their labor to employers and receive wages or salary in return for the services that they perform. In the United States, economic status varies remarkably along race and gender lines. This paper describes racial differences in three distinct but related dimensions of economic status: earnings, employment status, and wealth. Finally, this paper reviews some explanations economists have offered for these differences and considers the evidence available to support or refute them.

McElroy, Susan
Working Paper1999
Quality Certification and the Economics of Contract Software Development: A Study of the Indian Software Industry

A significant amount of software development is being outsourced to countries such as India. Many Indian software firms have applied for and received quality certifications such as ISO 9001, and the number of quality-certified software firms has steadily increased. Despite its growing popularity among Indian software developers, there is very little systematic evidence on the relationship of ISO certification to organizational performance. Using data on 95 Indian software firms and drawing upon site visits and interviews with Indian software firms and their US clients, a stylized model of a firm that develops software for others is developed to articulate the different ways in which ISO certification can affect firm profits. We conclude that ISO certification enhances firm growth. The results provide partial support for the proposition that ISO certification also enhances revenue for a given size, suggesting that certified firms receive a higher price per unit of output. This, in turn, is consistent with the notion that ISO certification also enhances the quality of output. The field studies confirm that although most firms see ISO certification as a marketing ploy, some of them do proceed to institute more systematic and better-defined processes for software development.

Arora, Ashish
Working Paper1999
The Black Male and the United States Economy

This paper examines the current status of black males in the United States economy and emphasizes several positive aspects of the changing status of black males over time. While it is acknowledged that the social and economic conditions of black males in the United States are troubling in many respects, the objective is to highlight the progress and achievements of black males. Most research on black males focuses solely on the problems and rarely highlights the successes and accomplishments; little attention has been paid to the achievements in education, the professional successes, and the positive community and family involvement of black males in the United States.
 

McElroy, Susan
Working Paper1999
The Impact of State Enterprise Zones on Urban Housing Markets

State-sponsored enterprise zones are a major economic development tool for over forty states. In many cases, the impact of these geographically targeted tax-based policies has never been evaluated. One roadblock to evaluation is the difficulty of controlling for the persistent effect of distressed local economic conditions while searching for possible programmatic effects. This paper examines the impact of enterprise zone programs on a variety of local economic growth rates in Florida, Pennsylvania and Virginia. The paper focuses on housing prices because they capitalize expected long run changes in prosperity. It is found that the states established zones in very distressed areas. The housing markets in zone areas are surprisingly strong following zone designation, in spite of continued weak income and employment growth. Estimates that distinguish the programmatic impact from differences arising due to pre-designation conditions indicate that two of the programs stimulated housing demand as measured by increases in home ownership and occupancy rates. However, the programs’ negative impact on labor market conditions offset the increased demand, causing housing prices to remain stable.

Engberg, John
Working Paper1999
The Indian Software Services Industry: Structure and Prospects

This paper reports the results of research on the Indian software industry, carried out at Carnegie Mellon University. The research uses a variety of sources, including a questionnaire survey of Indian software firms, field visits, and interviews with industry participants, observers, and US-based clients. The Indian software industry is remarkable in a number of respects. It is service rather than product oriented, heavily export oriented, and largely run by professional and entrepreneurial managements. Also, domestic market experience and expertise appear to confer very little benefit on successful exporters. Although the industry has grown in spectacular fashion, sustaining this performance will pose a number of challenges. To counteract the widely reported shortages of skilled software professionals and the possible competition from other low-wage, human-capital-rich countries, Indian firms are trying to move up the value chain by acquiring deeper knowledge of business domains and management capability, and to reduce costs by developing superior methodologies and tools. Whether, and how many, firms succeed will be a key test of their management skills and willingness to invest along a number of dimensions. From a social perspective, the disconnect between domestic and export markets is a major challenge, but one that the growing diffusion of computers and the improvement of the communication infrastructure should make easier to confront. In the end, the greatest impact the software industry is likely to have on the Indian economy is indirect, in its role as an exemplar of a new business organizational form and as an inspiration to other entrepreneurs.
 

Arora, Ashish
Working Paper1999
A Computational Model of Internal Control Testing Plan Selection

A Computational Model of Internal Control Testing Plan Selection
 

Peters, James
Working Paper1998
Are AFDC and SSI Substitutes?

The passage of the Personal Responsibility and Work Opportunity Reconciliation Act of 1996 for the first time placed strict limits on the amount of support families could receive from the Aid to Families with Dependent Children program (AFDC). Generally, researchers and policy makers have both assumed that any substitute support that does arise will be from non-governmental sources – women will either find work, receive support from family members, or be aided by local religious organizations or other private charities. We investigate the potential for one government program, the Supplemental Security Income (SSI) program, to simply substitute for the reductions in the AFDC program. We find strong evidence that AFDC and SSI are substitutes. This suggests that at least part of the effect of welfare reform will be to shift the burden of support for poor families from one government program to another rather than from governmental to non-governmental sources.
 

Sanders, Seth
Working Paper1998
Building Relationships Around Tasks: Psychological Contracts in Faculty-Doctoral Student Collaborations

Psychological contracts theory is applied to the study of faculty-doctoral student collaborations. Through a survey of 170 doctoral students, four types of psychological contracts are investigated. The quality of collaboration and frequency of meetings are found to differ significantly across these contract types. In addition, quality of collaboration and meeting frequency varied significantly across collaborations using different research methods (e.g., laboratory work, theory building) and disciplinary paradigms (i.e., high and low consensus). A comparison sample of 46 faculty from the same departments supported several trends observed in the doctoral student data.
 

Rousseau, Denise
Working Paper1998
Computer Technology, Human Labor, and Long-Run Economic Growth

Over the coming century, computer technology is likely to become capable of reproducing many of the skills now performed by human labor. This paper describes three models of the aggregate economic changes that occur when capital becomes capable of performing human work skills. The basic model, with a single sector and homogeneous labor, projects output growth rates over the next few decades that are substantially above historical growth rates in industrialized countries, assuming plausible increases in computer skill. The projected output growth is accompanied by structural changes reflecting the reduced role of labor, with wage growth lagging output growth and the labor share of output decreasing. Resource limits do not substantially affect the levels of output and wage growth in the near future. The 2-type model, with fixed skill differences between different workers, produces similar growth in output and average wages over the next several decades. However, the worker skill differences produce large increases in wage inequality between types of workers. The 2-sector model, with different skill requirements for different economic sectors, also produces similar growth in output and wages over the next several decades. For the three models, asymptotic growth in output and wages is substantially reduced by resource limits, worker skill differences, and sector skill differences, even though those constraints do not substantially reduce growth over the next few decades. The models produce patterns of change in the labor share and capital-output ratio that are consistent with broad trends in economic data.
 

Elliott, Stuart
Working Paper1998
Does Managed Care Matter? Hospital Utilization in the U.S. between 1985 and 1993

A study on the impact of HMOs on Hospital Utilization.
 

Gaynor, Martin
Working Paper1998
Factors Relevant to Senior Information Systems Managers' Decisions to Adopt New Computing Paradigms: An Exploratory Study

Factors Relevant to Senior Information Systems Managers' Decisions to Adopt New Computing Paradigms: An Exploratory Study
 

Bajaj, Akhilesh
Working Paper1998
Forecasting Analogous Time Series

Organizations that use time series forecasting on a regular basis generally forecast many variables, such as demand for many products or services. Within the population of variables forecasted by an organization, we can expect that there will be groups of analogous time series that follow similar, time-based patterns. The co-variation of analogous time series is a largely untapped source of information that can improve forecast accuracy (and explainability). This paper takes the Bayesian pooling approach to drawing information from analogous time series to model and forecast a given time series. Bayesian pooling uses data from analogous time series as multiple observations per time period in a group-level model. It then combines estimated parameters of the group model with conventional time series model parameters, using "shrinkage" weights estimated empirically from the data. Major benefits of this approach are that it 1) minimizes the number of parameters to be estimated (many other pooling approaches suffer from too many parameters to estimate), 2) builds on conventional time series models already familiar to forecasters, and 3) combines time series and cross-sectional perspectives in flexible and effective ways.
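The shrinkage idea described above can be sketched in a few lines. This is a minimal illustration of pooling, not the paper's actual estimator: a target series' local trend is blended with the group trend of analogous series, with a weight that (by an assumed, illustrative rule) favors the group when the local series is noisy.

```python
from statistics import mean, pvariance

def shrinkage_forecast(series, analogous, weight=None):
    """One-step-ahead forecast that pools a target time series with
    analogous series via a shrinkage-weighted trend estimate.
    The weighting rule is an illustrative assumption."""
    # Local trend: mean period-over-period change in the target series.
    diffs = [b - a for a, b in zip(series, series[1:])]
    local_trend = mean(diffs)
    # Group trend: mean change across the analogous series.
    group_means = [mean(b - a for a, b in zip(s, s[1:])) for s in analogous]
    group_trend = mean(group_means)
    if weight is None:
        # Shrink toward the group when the local series is noisy relative
        # to the spread of group trends (illustrative empirical weight).
        local_var = pvariance(diffs)
        group_var = pvariance(group_means) + 1e-9
        weight = group_var / (group_var + local_var)
    pooled_trend = weight * local_trend + (1 - weight) * group_trend
    return series[-1] + pooled_trend
```

Passing `weight=0` forecasts purely from the group trend and `weight=1` purely from the target series; estimating the weight from the data is what lets the cross-sectional information improve accuracy when the target series is short or volatile.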
 

Duncan, George
Gorr, Wilpen
Szczypula, Janusz
Working Paper1998
How Effective is Micro Harm Reduction in Reducing Macro Harm?

MacCoun (1996) distinguishes between micro and macro harm reduction and notes that reducing micro harm (harm per unit of use) may or may not reduce macro (aggregate) harm depending on its effect on use. We present a simple model that relates micro and macro harm through five parameters: price, quantity, elasticity of demand, elasticity of supply, and the social cost of drug use. Parameterizing the relationship for the US cocaine market in 1992 suggests that about 75% of the apparent benefit of reducing micro harm experienced by users would be offset by increases in use. This suggests that reducing micro harm experienced by users has merit but that reducing the costs drugs impose on non-users may merit greater attention, since reducing those costs carries no risk of being offset by increases in use.
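One stylized way to see the offset mechanism is to treat micro harm as part of the full price users face, so that reducing it increases use along a demand curve. The sketch below is an illustration under assumed numbers and a constant-elasticity demand curve; it is not the paper's five-parameter model, which also incorporates the elasticity of supply.

```python
def offset_fraction(price, harm, demand_elasticity, harm_reduction):
    """Fraction of the direct benefit of a micro-harm reduction that is
    offset by increased use. Stylized: micro harm is folded into the full
    price users face; demand has constant (negative) elasticity."""
    full_price = price + harm
    new_harm = harm * (1 - harm_reduction)
    new_full_price = price + new_harm
    # Quantity response along the demand curve (initial quantity = 1).
    q_ratio = (new_full_price / full_price) ** demand_elasticity
    direct_benefit = harm - new_harm          # benefit if use were fixed
    new_macro_harm = q_ratio * new_harm       # macro harm = use x micro harm
    realized_benefit = harm - new_macro_harm
    return 1 - realized_benefit / direct_benefit
```

With hypothetical values such as a money price and micro harm of equal size and unit-elastic demand, a 10% micro-harm reduction yields an offset of roughly half the direct benefit, illustrating how sensitive the offset is to how much of the full price the micro harm represents.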
 

Caulkins, Jonathan
Working Paper1998
How Should Low-Level Drug Dealers Be Punished?

The US pursues a number of drug control strategies, but it invests the most resources in arresting, prosecuting, and incarcerating low-level drug dealers. Thus, it is important to reflect on what constitutes appropriate and expedient punishment for these offenders. Currently punishments vary from nothing to very long prison sentences; substantial variation is appropriate because not all low-level dealers are equally destructive. Unfortunately, the current system does not punish the most culpable offenders most severely. A stronger correlation between severity of sanction and culpability could be achieved by: (1) moving decisions concerning length of incarceration from the state level to the local level, (2) reducing minimum sanction severity to expand the variation between minimum and maximum sanctions for all defendants except those who meet locally established definitions of what constitutes unusually destructive forms of dealing, and (3) allowing judges to depart from presumptive sentences instead of computing sentence length from fixed formulas based on readily observable - but only marginally relevant - criteria such as quantity possessed. The goal would be to allow police, prosecutors, and judges to work together to identify and target long sentences on the minority of most vicious dealers. This would serve the interests of justice, by making the punishment better fit the crime, and of efficiency, by making more effective use of scarce and expensive punishment capacity.
 

Caulkins, Jonathan
Working Paper1998
Licensing in the chemical industry

A patent confers on the patentee the right to exclude others from the use of the knowledge that the patent covers. Patents, however, are not the only feasible way to reach exclusiveness and other economic means might be used as well. Indeed, the alternatives are often thought to be more effective at enabling the inventor to benefit from the innovation than patenting itself (Levin et al., 1987; Cohen et al., 1996). The instrument of exclusion is, however, not a matter of indifference for society. The way in which patents are used (or not used) affects the evolution of the industrial structure and the technology itself. Specifically, unlike such alternatives as lead time, first mover advantage and secrecy, patents can be used to sell technology, typically through license contracts. Simply put, patents can play a key role in facilitating the purchase and sale of technology. This essay moves beyond the traditional approach to patents that has mainly focused on patents as means to exclude others and highlights the role of the market for technology. A market for technology not only helps diffuse existing technology more efficiently, it also enables firms to specialize in the generation of new technology. In turn, such specialization is likely to hasten the pace of technological change itself. However, the development of a market for technology is not an automatic outcome, and depends upon a number of factors that include the strength of patent rights, as well as the nature of the technology and the industry structure itself.

Arora, Ashish
Working Paper, 1998
Licensing in the presence of competing technologies

In technology-based industries, many incumbent firms license their technology to other firms that will potentially compete with them. Such a strategy is difficult to explain within traditional models of licensing. This paper extends the literature on licensing by relaxing the widespread assumption of a ‘unique’ technology holder. A model is developed with many technological trajectories for the production of a differentiated good. We find that competition in the market for technology induces licensing of innovations, and that the number of licenses can be inefficiently large. A strong testable implication of the theory is that the number of licenses per patent holder decreases with the degree of product differentiation.

Arora, Ashish
Working Paper, 1998
Managing Information Privacy and Information Access in the Public Sector

Government agencies collect and disseminate data that bear on the most important issues of public interest. Advances in information technology, particularly the Internet, have multiplied the tension between demands for ever more comprehensive databases and demands for the shelter of privacy. In mediating between these two conflicting demands, agencies must address a host of difficult problems. These include providing access to information while protecting confidentiality, coping with health information databases, and ensuring consistency with international standards. The policies of agencies are determined by what is right for them to do, what works for them, and what they are required to do by law. They must interpret and respect the ethical imperatives of democratic accountability, constitutional empowerment, and individual autonomy. They must keep pace with technological developments by developing effective measures for making information available to a broad range of users. They must both abide by the mandates of legislation and participate in the process of developing new legislation that is responsive to changes that affect their domain. In managing confidentiality and data access functions, agencies have two basic tools: techniques for disclosure limitation through restricted data and administrative procedures through restricted access. The technical procedures for disclosure limitation involve a range of mathematical and statistical tools. The administrative procedures can be implemented through a variety of institutional mechanisms, ranging from privacy advocates, through internal privacy review boards, to a data and access protection commission.

Duncan, George
Working Paper, 1998
Obtaining Information while Preserving Privacy: A Markov Perturbation Method for Tabular Data

Preserving privacy appears to conflict with providing information. Statistical information can, however, be provided while preserving a specified level of confidentiality protection. The general approach is to provide disclosure-limited data that maximizes its statistical utility subject to confidentiality constraints. Disclosure limitation based on Markov chain methods that respect the underlying uncertainty in real data is examined. For use with categorical data tables a method called Markov perturbation is proposed as an extension of the PRAM method of Kooiman, Willenborg, and Gouweleeuw (1997). Markov perturbation allows cross-classified marginal totals to be maintained and promises to provide more information than the commonly used cell suppression technique.
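The mechanics of this kind of matrix-based perturbation can be sketched in a few lines of Python. The data, category count, and transition matrix below are hypothetical, and the sketch illustrates PRAM-style perturbation generally rather than the paper's Markov perturbation method: each record's category is re-drawn from the corresponding row of a known transition matrix, and because the matrix is published, aggregate frequencies remain estimable from the released data.

```python
import random
from collections import Counter

random.seed(0)

K = 3  # number of categories in the variable being protected
# Hypothetical categorical microdata (e.g., coded survey responses).
data = [random.randrange(K) for _ in range(20_000)]

# Row-stochastic transition matrix: P[i][j] is the probability that a
# record whose true category is i is released with category j.
P = [[0.90, 0.05, 0.05],
     [0.05, 0.90, 0.05],
     [0.05, 0.05, 0.90]]

def pram(value):
    """Perturb one record by drawing from its row of the transition matrix."""
    return random.choices(range(K), weights=P[value])[0]

released = [pram(v) for v in data]

# The expected released frequencies are P^T applied to the true
# frequencies, so a user who knows P can correct aggregate statistics.
true_freq = [Counter(data)[k] / len(data) for k in range(K)]
expected = [sum(P[i][k] * true_freq[i] for i in range(K)) for k in range(K)]
released_freq = [Counter(released)[k] / len(released) for k in range(K)]
```

With 20,000 records the released frequencies land within a fraction of a percentage point of their expectations, while any single record's released category reveals its true value only up to the 0.90 diagonal odds.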

Duncan, George
Working Paper, 1998
Optimal Disclosure Limitation Strategy in Statistical Databases: Deterring Tracker Attacks Through Additive Noise

Disclosure limitation methods transform statistical databases to protect confidentiality. A statistical database responds to queries with aggregate statistics. The database administrator should maximize legitimate data access while keeping the risk of disclosure below an acceptable level. Legitimate users seek statistical information, generally in aggregate form; malicious users, the data snoopers, attempt to infer confidential information about an individual data subject. Tracker attacks are of special concern for databases accessed online. This article derives optimal disclosure limitation strategies under tracker attacks for the important case of data masking through additive noise. Operational measures of the utility of data access and of disclosure risk are developed. The utility of data access is expressed so that tradeoffs can be made between the quantity and the quality of data to be released. The article shows that an attack by a data snooper is better thwarted by a combination of query restriction and data masking than by either disclosure limitation method separately. Data masking by independent noise addition and data perturbation are considered as extreme cases in the continuum of data masking using positively correlated additive noise. Optimal strategies are established for the data snooper. Circumstances are determined under which adding autocorrelated noise is preferable to using existing methods of either independent noise addition or data perturbation. Both moving average and autoregressive noise addition are considered.
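The value of correlated noise against an averaging attack can be illustrated with a small simulation (hypothetical numbers, not the article's model): a snooper who repeats the same query and averages the responses drives independent noise down at rate sigma/sqrt(n), but positively autocorrelated AR(1) noise inflates the variance of that average by roughly (1 + rho)/(1 - rho), so each extra query buys the snooper much less.

```python
import random
import statistics

random.seed(1)

true_answer = 500.0   # confidential aggregate the snooper queries repeatedly
sigma, rho, n = 10.0, 0.9, 1000

def independent_noise(n, sigma):
    return [random.gauss(0, sigma) for _ in range(n)]

def ar1_noise(n, sigma, rho):
    """Positively autocorrelated noise with stationary standard deviation sigma."""
    innov_sd = sigma * (1 - rho ** 2) ** 0.5
    e = [random.gauss(0, sigma)]
    for _ in range(n - 1):
        e.append(rho * e[-1] + random.gauss(0, innov_sd))
    return e

# The snooper averages n repeated responses to wash the noise out.
avg_indep = statistics.mean(true_answer + z for z in independent_noise(n, sigma))
avg_ar1 = statistics.mean(true_answer + z for z in ar1_noise(n, sigma, rho))

# Per-response protection is identical (same marginal variance), but the
# correlated noise leaves the snooper's averaged estimate far less precise.
```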

Duncan, George
Working Paper, 1998
R&D, Knowledge Spillovers, and Competition among Firms with Asymmetric Technological Capabilities

Arora, Ashish
Vogt, William
Working Paper, 1998
SEAM: A State-Entity-Activity-Model for a Well-Defined Workflow Development Methodology

Working Paper, 1998
Determinants of Information Technology Outsourcing Among Health Maintenance Organizations

This paper extends transaction cost economics by examining the effect of relaxing two of its underlying assumptions. First, transaction cost economics relies on an assumption of risk neutrality. This paper argues that an organization's transactions vary in the risk they impose on the organization, and that organizations are more likely to embed riskier transactions within a hierarchy. Second, transaction cost economics assumes that transactions are organized independently. This paper argues instead that organizations have an underlying propensity to organize transactions through hierarchy or contracting, and that this propensity is related to an organization's capabilities, such as absorptive capacity. The analysis shows that transaction organization is a function of transaction risk. Transaction risk, rather than uncertainty or firm asset specificity, is the most important factor determining transaction organization. The analysis also shows that transaction organization is a function of an organization's absorptive capacity and technological diversity. This means that transactions within an organization are interdependent.

Padman, Rema
Working Paper, 1997
Division of Labor and the Transmission of Growth

This paper studies how an independent upstream capital good sector in a technology based industry can act as a mechanism for the transmission of growth across countries. Technologies, once developed, can be ‘transferred’ to other countries at low incremental cost. If there are upstream firms which specialize in providing technology and engineering services to downstream buyer firms, then the greater the number of such specialists, the greater the net surplus that buyers get. Since the number of specialists is determined by the size of the downstream sector, the growth of the downstream sector in leading countries (first world) has beneficial effects for the growth of the downstream sector in follower countries (less developed countries). We empirically test this proposition using a comprehensive data set of investments in chemical plants in the developing countries during the 1980s. We find that one additional specialized supplier in a given process technology would have increased the expected investment in LDCs by $100 million to $200 million, with the increases greater in more mature technologies, and for larger LDCs.

Arora, Ashish
Working Paper, 1997
Enter at your own risk: HMO participation and enrollment in the Medicare risk Market

This paper examines HMO participation and enrollment in the Medicare risk market for the years 1990 to 1995. A profit-maximization model of HMO behavior is developed, which explicitly considers potential linkages between an HMO’s production decision in the commercial enrollee market and its participation and production decisions in the Medicare risk market. The results suggest that the AAPCC is a primary determinant of HMO participation, while the price of a supplemental Medicare insurance policy positively affects HMO Medicare enrollment. This paper also finds empirical support for the existence of complementarities in the joint production of an HMO’s commercial and Medicare products.

Arora, Ashish
Gaynor, Martin
Working Paper, 1997
Firm Size and Capabilities; Regional Agglomeration and the Adoption of New Technology

The literature on agglomeration economies suggests that, in addition to firm-specific attributes, the local geographic context conditions the expected profitability of technology adoption. All theories of technology diffusion assume that inter-firm learning is the outcome of contact with prior adopters. Yet, with few exceptions, the attributes of location that maximize the opportunities for learning (and hence reduce the costs of technology adoption for all firms in the same locale) have been given only cursory treatment. This paper develops and tests a model in which both firm-specific capabilities and place-specific external economies affect the firm's decision to adopt a new technology. The data come from two national surveys conducted in 1987 and 1991. Because information is available for two different time periods, it is possible to specify firm- and place-specific conditions that precede the technology adoption decision. It is found that localization (as measured by regional clustering of enterprises in related industries) and urbanization (as measured by the diversity of industries, and by the concentration of degree-granting engineering institutions) provide knowledge spillovers that facilitate the adoption of new technology by local establishments. Moreover, the impact of urbanization economies is size-related: the impact of a diverse region on adoption is even greater for small enterprises than for large ones.

Kelley, Maryellen
Working Paper, 1997
From Mission to Commercial Orientation: Perils and Possibilities for Federal Industrial Technology Policy

Kelley, Maryellen
Working Paper, 1997
Increasing Consolidation in Health Care Markets: What are the Antitrust Policy Implications?

Gaynor, Martin
Working Paper, 1997
Information Technology, Organizational Capabilities and Context: How do regional economies and institutions matter to the spread of innovation?

Kelley, Maryellen
Working Paper, 1997
On Heterogeneous Database Retrieval: A Cognitively-Guided Approach

Retrieving information from heterogeneous database systems involves a complex process and remains a challenging research area. This paper proposes a cognitively-guided approach for developing an information retrieval agent that takes the user's information request, identifies relevant information sources, and generates a multidatabase access plan. The work is distinctive in that the agent design is based on an empirical study of how human experts retrieve information from multiple, heterogeneous database systems. To improve on empirically observed information retrieval capabilities, the design incorporates mathematical models and algorithmic components. These components optimize the set of information sources that need to be considered to respond to a user query and are used to develop efficient multidatabase access plans. This agent design, which integrates cognitive and mathematical models, has been implemented using the Soar architecture.

Working Paper, 1997
Physician Networks and Their Implications for Competition in Health Care Markets

The physician market is being transformed in dramatic ways. One of the most notable areas of change has been tremendous growth in physician networks, such as independent practice associations (IPAs). As of August 1996, there were approximately 4,000 IPAs with an average of 300 physicians each, up from approximately 1,500 in 1990. Physician networks are made up of otherwise independent physicians that join together to market themselves collectively to health insurers, and in some cases, directly to employers. Normally, independent competitors are not allowed to set prices jointly. The key question here is whether these networks represent an efficient response to the changing structure of health care markets or strategic attempts to increase market power.

Gaynor, Martin
Working Paper, 1997
Spatio-Temporal Forecasting of Crime: Application of Classical and Neural Network Methods

This paper introduces a new spatio-temporal forecasting methodology that combines artificial neural networks and cellular automata with GIS-based data. The technique, which we refer to as chaotic cellular forecasting (CCF), is similar to spatial adaptive filtering, due to Foster and Gorr (1986), and weighted spatial adaptive filtering, due to Gorr and Olligschlaeger (1994), in that it uses contiguity relationships and the geographer's assumption that influence between data points decays with distance. As with spatial adaptive filtering, the methodology uses an iterative process to arrive at a solution. Unlike spatial adaptive filtering, however, chaotic cellular forecasting uses a gradient descent method rather than a grid search to find the optimal set of parameters (or, in the case of artificial neural networks, weights). In addition, and most importantly, CCF has the nonlinear and multi-model functional form commonly used in neural net modeling, allowing for increased pattern recognition and accommodation of spatio-temporal heterogeneity. The result is a robust spatio-temporal forecasting method that requires very little model specification, is self-adaptive, and performs very well on data sets that exhibit non-traditional statistical behavior.
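The gradient-descent ingredient can be illustrated with a toy example (synthetic data and a linear stand-in for the network, not the paper's CCF implementation): weights on a cell's own lagged value and its neighbors' lagged average are found by descending the squared-error surface rather than by searching a parameter grid.

```python
import random

random.seed(2)

# Synthetic training data: each observation is (own lag, neighbor-average
# lag, next-period value), generated with true weights 0.6 and 0.3.
data = []
for _ in range(500):
    own, nbr = random.random(), random.random()
    data.append((own, nbr, 0.6 * own + 0.3 * nbr + random.gauss(0, 0.01)))

w = [0.0, 0.0]   # weights on own lag and neighbor lag
lr = 0.5         # learning rate (step size)

for _ in range(2000):
    g = [0.0, 0.0]   # gradient of mean squared error
    for own, nbr, y in data:
        err = w[0] * own + w[1] * nbr - y
        g[0] += 2 * err * own / len(data)
        g[1] += 2 * err * nbr / len(data)
    w = [w[0] - lr * g[0], w[1] - lr * g[1]]
# w converges to (approximately) the least-squares weights.
```

A grid search over the same two weights at 0.01 resolution would evaluate tens of thousands of candidate models; the descent reaches the optimum with far less work, which can matter when many spatial units each need their own fit.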

Gorr, Wilpen
Working Paper, 1997
The Impact of NSF Support For Basic Research in Economics

This paper studies an unusually rich data set of all 1,473 applications to the NSF in economics during 1985-1990. The data set is unusual in that one can observe the characteristics of the scientists whose applications were accepted (414 applications in our sample) as well as of those whose applications were rejected. The paper estimates the effect of NSF grants on the research output (quality-adjusted publications) of individual researchers. Estimates indicate that NSF funding has only a modest effect on scientific publication output, with the possible exception of junior scholars. The paper also analyzes some related issues, such as the factors that affect the NSF selection process and the decision about the size of the grants.

Arora, Ashish
Working Paper, 1997
The Sociology of Groups and The Economics of Incentives: Theory and Evidence on Compensation Systems

This paper incorporates the sociological concept of "group norms" into an economic analysis of pay systems. We use a behavioral microeconomic model and a unique survey of medical groups to examine the theoretical and empirical relationship between group norms and incentive pay. Our findings suggest that, at least for medical groups, norms are binding constraints in the choice of pay practices. While group norms matter, the patterns in the data suggest that they are not all that matters. Analysis of the preferences and activities of individual physicians indicate that factors highlighted by the economic theory of agency, notably income insurance and multi-task considerations, also shape pay policies. The conclusion from these results is that the sociological concept of group norms augments rather than replaces more conventional economic analyses of pay practices.

Gaynor, Martin
Working Paper, 1997
Why Do Firms Adopt Green Design?: Organizational Opportunity, Organizational Resources, Costs, or Regulation

This paper evaluates four explanations for industrial facilities' incorporation of environmental considerations in their product designs (i.e., "green design"): organizational opportunity, organizational resources, cost reduction, and environmental regulatory pressure. Initial analyses of facility-level workplace practice and environmental data support all four explanations. Facilities vigorously practicing cost reduction, with greater opportunities and resource bases to engage in green design, and facing more intense environmental regulatory pressure are substantially more likely to adopt green design. The results of multivariate analyses, however, indicate that adoption of green design is primarily determined by organizational opportunities and resources.

Florida, Richard
Working Paper, 1997
Adjusting GPA to Reflect Course Difficulty

The computation of Grade Point Average (GPA) incorrectly assumes that grades are comparable across courses and instructors. GPA overstates the performance of students who elect an "easier" course of study relative to those who choose a more "difficult" course of study. This paper proposes a method of adjusting GPA and applies it to data from one cohort of undergraduates at Carnegie Mellon University. Adjusted GPAs are more highly correlated with students' high school Grade Point Averages and with SAT scores than are the raw GPAs or GPAs adjusted using a prominent alternative method, Item Response Theory. A survey of students finds that the new method's estimates of relative course difficulty are consistent with students' perceptions of relative course difficulty.
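One simple version of such an adjustment can be sketched as an additive model, grade = student ability + course difficulty offset, fit by alternating averages. The grades below are invented, and this is not the paper's estimator, only an illustration of why adjustment changes rankings.

```python
from collections import defaultdict

# Invented (student, course, grade) records on a 0-4 scale; every student
# takes both a leniently graded course and a harshly graded one.
records = [
    ("s1", "lenient", 4.0), ("s1", "harsh", 3.0),
    ("s2", "lenient", 3.7), ("s2", "harsh", 2.3),
    ("s3", "lenient", 3.9), ("s3", "harsh", 2.8),
]

ability = defaultdict(float)      # adjusted-GPA estimate per student
difficulty = defaultdict(float)   # grading-leniency offset per course

# Backfitting: alternately estimate abilities given difficulties and
# difficulties given abilities until the values settle.
for _ in range(50):
    by_student, by_course = defaultdict(list), defaultdict(list)
    for s, c, g in records:
        by_student[s].append(g - difficulty[c])
    for s, resid in by_student.items():
        ability[s] = sum(resid) / len(resid)
    for s, c, g in records:
        by_course[c].append(g - ability[s])
    for c, resid in by_course.items():
        difficulty[c] = sum(resid) / len(resid)

# The lenient course ends up with a positive offset and the harsh one
# with a negative offset, so raw GPA flatters lenient-course grades.
```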

Caulkins, Jonathan
Larkey, Patrick
Working Paper, 1996
Insurance, Vertical Restraints and Competition

Gaynor, Martin
Working Paper, 1996
Physician Contracting with Health Plans, A Survey of the Literature

Gaynor, Martin
Working Paper, 1996
Returns to Specialization, Transaction Costs, and the Dynamics of Industry Evolution

Arora, Ashish
Working Paper, 1996
A Structural Perspective on Organizational Cognitions: Attributions of Power, Performance, and Attitudes

Carley, Kathleen
Krackhardt, David
Working Paper, 1995
Estimating Elasticities of Demand for Cocaine and Heroin with Data from the Drug Use Forecasting System

Caulkins, Jonathan
Working Paper, 1995
Quantifying Passive Use Values From "Faint" Behavioral Trails: Television News Viewing and the Exxon Valdez

Observable actions in response to decreases in environmental quality are identified for passive users, those who will never use a damaged resource directly. The welfare implication of changes in the probability and value of time devoted to viewing news of the Valdez oil spill is estimated using a household production approach and assuming weak substitutability between news consumption and environmental protection. The implied change in welfare from viewing the news, expected to be a large component of passive use value, ranges from $12 million to $17 million in 1989 dollars. The approach also provides a basis for a conditioning element in the study of other observable behavior and for more general application to the observable costs of major news events.

Farrow, Scott
Working Paper, 1995
Benchmarking Economic Development: Regional Strategy in Silicon Valley, Austin, Seattle, Oregon and Cleveland.

This paper explores "best-practice" regional economic development strategies in cities and regions throughout the United States. Benchmarking case studies were performed on Silicon Valley, Austin, Seattle, Oregon, and Cleveland consisting of field research, site visits and extensive personal interviews. While some of these regional efforts are quite new, and have not advanced beyond the formative stage, together they provide a good example of the cutting-edge approaches to regional economic development.

Florida, Richard
Gleeson, Robert
Working Paper, 1994
Capital and Creative Destruction: Venture Capital, Technological Change, and Economic Development

Florida, Richard
Working Paper, 1994
Choosing Among Fuels and Technologies for Cleaning Up the Air

Hahn, Robert
Working Paper, 1994
Comparative Study of Cross Sectional Methods for Time Series with Structural Changes

Duncan, George
Gorr, Wilpen
Szczypula, Janusz
Working Paper, 1994
Do Stock Prices Follow Random Walks: An Analysis of the Tokyo Stock Exchange

Working Paper, 1994
Dyadic and Demographic Strategies in Cross-Cultural Management: A Test of the Effect of Complementary Practices in MNC Performance

Appold, Stephen
Working Paper, 1994
Employee Stock Ownership Plans and Financial Performance in American Firms

Working Paper, 1994
Endogenous Preferences: A Structural Approach

Krackhardt, David
Working Paper, 1994
Incompletely Specified Probabilistic Networks

Roehrig, Stephen
Working Paper, 1994
Intuitive Time-Series Extrapolation of Sales

Peters, James
Working Paper, 1994
ONDCP's First Four Years as a Policy Agency

Caulkins, Jonathan
Working Paper, 1994
Optimum Population -- A New Look: A Hamiltonian Approach Towards a Dynamic Analysis

Working Paper, 1994
The Effect of Culture and Staffing Patterns on Firm Performance: An Investigation of MNEs in Thailand

Appold, Stephen
Working Paper, 1994
The Employment of Women Managers and Professionals in Thailand

Appold, Stephen
Working Paper, 1994
Toward a Shared Economic Vision for Pittsburgh and Southwestern Pennsylvania: A White Paper Update

Gleeson, Robert
Working Paper, 1994
