Newton, H. Joseph, 1949-
Overview
Works:  76 works in 138 publications in 2 languages and 590 library holdings 

Genres:  Handbooks and manuals; Academic theses; Conference papers and proceedings
Roles:  Author, Editor, Contributor 
Classifications:  QA276.4, 005.55 
Most widely held works by H. Joseph Newton
TIMESLAB : a time series analysis laboratory by H. Joseph Newton (Book)
16 editions published in 1988 in 3 languages and held by 105 WorldCat member libraries worldwide
The elements of statistics : with applications to economics and the social sciences by James Bernard Ramsey (Book)
5 editions published in 2002 in English and held by 98 WorldCat member libraries worldwide
One hundred nineteen Stata tips by N. J. Cox (Book)
9 editions published in 2014 in English and Undetermined and held by 76 WorldCat member libraries worldwide
Provides concise and insightful notes about commands, features, and tricks that will help you obtain a deeper understanding of Stata. The book comprises the contributions of the Stata community that have appeared in the "Stata Journal" since 2003
Seventy-six Stata tips (Book)
12 editions published in 2009 in English and held by 70 WorldCat member libraries worldwide
Since 2003, "The Stata Journal" has included Stata Tips on special issues in data analysis with Stata. "Seventy-six Stata Tips, 2nd Edition" compiles these useful guides into a compact tome for ease of reference. In keeping with the Stata spirit, tips are from Stata users and StataCorp employees alike and will serve as guideposts for both new and experienced users. "Seventy-six Stata Tips" includes the first 33 tips of the series, previously published in the book "Thirty-three Stata Tips"
Thirty-three Stata tips (Book)
7 editions published in 2006 in English and held by 54 WorldCat member libraries worldwide
StatConcepts : a visual tour of statistical ideas by H. Joseph Newton (Book)
6 editions published in 1997 in English and held by 54 WorldCat member libraries worldwide
"STATCONCEPTS gives students an interactive, visual tour of 18 important statistical concepts. An essential companion to any introductory statistics text, it offers an easier, non-mathematical way to help your students understand statistical concepts. Effectively designed to be used with its accompanying software, the book is organized into 28 digestible labs that offer immediate visual access to concepts, terminology, and results." (Amazon)
Computing science and statistics : graphics and visualization by Symposium on the Interface (Book)
1 edition published in 1992 in English and held by 13 WorldCat member libraries worldwide
Meta-analysis in Stata : an updated collection from the Stata Journal by Tom M. Palmer (Book)
6 editions published between 2009 and 2016 in English and held by 10 WorldCat member libraries worldwide
This collection provides detailed descriptions of both standard and advanced meta-analytic methods and their implementation in Stata. Readers will gain access to the statistical methods behind the rapid increase in the number of meta-analyses reported in the social science and medical literature. The book shows how to conduct and interpret meta-analyses as well as produce highly flexible graphical displays. Using meta-regression, it examines reasons for between-study variability in effect estimates. The book also employs advanced methods for the meta-analysis of diagnostic test accuracy studies, dose-response meta-analysis, meta-analysis with missing data, and multivariate meta-analysis
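The inverse-variance pooling at the heart of the standard (fixed-effect) meta-analysis described above can be sketched in a few lines. This is an illustrative Python sketch with hypothetical study data, not code from the book (the book itself works in Stata):

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance (fixed-effect) pooling of study effect estimates."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    # 95% confidence interval from the normal approximation
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Three hypothetical studies: effect estimates (e.g., log odds ratios) and variances
effects = [0.10, 0.30, 0.25]
variances = [0.04, 0.09, 0.05]
pooled, ci = fixed_effect_meta(effects, variances)
print(round(pooled, 4), ci)
```

Meta-regression and random-effects models extend this idea by modeling the between-study variability that the fixed-effect model assumes away.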
An intelligent decision making system for detecting high impedance faults by Chang Jong Kim (Book)
1 edition published in 1989 in English and held by 4 WorldCat member libraries worldwide
The clearing of distribution line faults is usually accomplished by devices which can sense the overcurrent produced by a fault and react to disconnect the faulted section of the feeder from the healthy section. However, high impedance faults do not draw sufficient fault current to be detected by such a conventional protective scheme. Such faults may be caused by a conductor on the ground. Arcing is usually associated with these faults, which may result in a fire hazard. The harmonic currents characterized by an arc are variable, transitory, and random in their behavior. The relative amplitude increase of harmonic currents is very large in some high impedance faults, but sometimes it is very low, and other times very similar to the level of the normal state. While a few techniques to detect high impedance faults have been proposed, and some progress has been made, a complete solution has not been found. This research concentrates on designing an intelligent decision making system which uses multiple detection techniques incorporated with an appropriate detection reasoning method and a learning ability to provide a more effective solution for high impedance fault detection. Major parts of this system are a technique selection, a technique combination, and an induction process. The method of decision making under incomplete knowledge is used to select appropriate techniques because the information on the performance of techniques is available but not complete. With these selected techniques, the Dempster-Shafer theory is adopted to find a final belief about the system status by combining the belief from each technique. Inductive reasoning with minimum entropy is applied to find decision rules and thus to adjust the technique selection process. A learning detection system which combines all three major parts is proposed to realize this intelligent decision making system.
The learning detection system synthesizes the final belief of the combined techniques, the status output of a decision tree from the inductive reasoning process, and an event detector output to detect and identify the system status. The intelligent decision making system makes a smart decision on an example execution with a complex test set of sample data
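Dempster-Shafer combination, which the abstract uses to fuse the beliefs of the individual detection techniques, has a compact general form. The sketch below is a generic Python illustration with hypothetical mass assignments over a fault/normal frame, not the dissertation's actual detector:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two basic mass assignments whose
    focal elements are frozensets over the same frame of discernment."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty set
    k = 1.0 - conflict           # normalize by the non-conflicting mass
    return {s: w / k for s, w in combined.items()}

FAULT, NORMAL = frozenset({"fault"}), frozenset({"normal"})
EITHER = FAULT | NORMAL  # the whole frame, i.e. ignorance

# Two hypothetical detection techniques reporting belief in a fault
m1 = {FAULT: 0.6, EITHER: 0.4}
m2 = {FAULT: 0.7, NORMAL: 0.1, EITHER: 0.2}
m = dempster_combine(m1, m2)
print(round(m[FAULT], 4))
```

Because both techniques lean toward a fault, the combined belief in FAULT exceeds either input belief, which is the reinforcement effect the combination step relies on.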
Graphics and visualization : Computing science and statistics by Symposium on the Interface: Computing Sciences and Statistics (Book)
2 editions published in 1992 in English and held by 4 WorldCat member libraries worldwide
The elements of statistics with applications to economics and the social sciences by James Bernard Ramsey
1 edition published in 2002 in English and held by 3 WorldCat member libraries worldwide
Some results in autoregressive modeling by Katherine Bennett Ensor (Book)
1 edition published in 1986 in English and held by 3 WorldCat member libraries worldwide
Least squares estimates are not as widely used as Yule-Walker estimates when modeling autoregressive processes, mainly because there is a well-known algorithm, Levinson's algorithm, which may be used to obtain the Yule-Walker estimates for orders 1, ..., M recursively, but there is no well-known recursive algorithm for the least squares estimates. A recursive algorithm is presented to obtain the least squares estimates for orders 1, ..., M. In addition to the recursive algorithm, the effect of estimating the order of the process on the estimation of the peak frequencies is examined. Both consistent and inconsistent order-determining criteria are considered, where the inconsistent methods asymptotically overestimate the true order of the process
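Levinson's algorithm, mentioned in the abstract, steps up through model orders 1, ..., M using only the autocovariances. Here is a minimal Python sketch of that Yule-Walker recursion, checked against a hypothetical AR(1) process; it is not the least squares recursion the thesis itself develops:

```python
def levinson(r, M):
    """Levinson's algorithm: Yule-Walker AR estimates for orders 1..M.

    r[0..M] are sample autocovariances; returns a dict mapping each
    order m to its coefficient list [phi_{m,1}, ..., phi_{m,m}].
    """
    fits = {}
    phi = []
    sigma2 = r[0]  # innovation variance at order 0
    for m in range(1, M + 1):
        # Reflection coefficient for stepping up from order m-1 to m
        k = (r[m] - sum(phi[j] * r[m - 1 - j] for j in range(m - 1))) / sigma2
        # Update coefficients and innovation variance
        phi = [phi[j] - k * phi[m - 2 - j] for j in range(m - 1)] + [k]
        sigma2 *= (1.0 - k * k)
        fits[m] = phi[:]
    return fits

# Autocovariances of an AR(1) with phi = 0.5 and unit innovation variance:
# r[h] = 0.5**h / (1 - 0.5**2)
r = [0.5 ** h / 0.75 for h in range(4)]
fits = levinson(r, 3)
print(fits[3])  # higher-order coefficients should be near zero
```

For a true AR(1), every higher-order fit recovers the same first coefficient with the extra coefficients near zero, which is exactly what order-determining criteria exploit.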
Trade restricting policies under uncertainty and duopoly with differentiated products by Kisoo Kim (Book)
1 edition published in 1988 in English and held by 3 WorldCat member libraries worldwide
This dissertation provides analyses of several trade restricting policies under uncertainty and duopoly with differentiated products. It considers the non-equivalence of tariffs and quotas under uncertainty with a duopoly market and differentiated products when the behavior patterns of both producers are endogenously determined. It also examines how the two strategies of oligopolistic competition (setting price or quantity) affect the results of non-equivalence of tariffs and quotas in the Itoh and Ono model (1984), and the welfare and protective effects of other policy alternatives when the duopoly produces differentiated but substitute goods and competes under uncertainty. The dissertation extends the Itoh and Ono model (1984) in three directions. First, the dissertation develops the Itoh and Ono model under uncertainty using their assumptions about the firm's behavior. I derive the properties of home and foreign reaction functions, show how uncertainty shifts the curves, and compare the equilibria under certainty and uncertainty. Second, the dissertation examines how the two strategies of oligopolistic competition (setting price or quantity) affect the results of non-equivalence of tariffs and quotas in the Itoh and Ono model. When exogenous uncertainty about market demands is introduced, a monopolist is not, in general, indifferent between choosing a quantity then selling at the resulting market price and setting a price then producing what the market demands. Therefore the general non-equivalence of tariffs and quotas does not hold in the quantity setting model, whereas it does hold in the price setting model. Third, the dissertation compares the effects of other trade restricting policies in addition to traditional tariff and quota policies under uncertainty and duopoly with differentiated products. It compares and contrasts the effects of an ad valorem tariff, a fixed quota, a variable quota, lump-sum subsidies to the home producer, and fixed taxes on the foreign producer
Alternative funds flow measures as predictors of failure by Fannie Lee Malone (Book)
1 edition published in 1984 in English and held by 3 WorldCat member libraries worldwide
One approach to evaluating the usefulness of financial information is to measure its ability to predict the outcome of future events. This study evaluates the merits of alternative methods of reporting funds flows in terms of the prediction of bankruptcy. The funds flow controversy provides three measurement alternatives: net income, working capital, and cash. A logistic regression model was constructed for each of the three measurement alternatives. The dependent variable in this study is whether an enterprise is bankrupt or non-bankrupt. The primary purpose of this study is to compare the predictive ability of the three measurements: net income, working capital, and cash. The secondary purpose is to measure how well a funds flow description of failure (AFFDOF) predicts the failure of an enterprise. Based on the evidence, it appears that there are no significant differences among the measurements in predicting bankruptcy. Also, it appears that the AFFDOF model is significant in predicting bankruptcy
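A logistic regression of a bankruptcy indicator on a funds flow measure, as the abstract describes, can be sketched as follows. The single predictor, the data, and the gradient-ascent fit below are all hypothetical illustrations, not the study's actual variables or estimation method:

```python
import math

def fit_logistic(xs, ys, lr=0.1, iters=2000):
    """Fit P(bankrupt) = 1 / (1 + exp(-(b0 + b1*x))) by gradient ascent
    on the log-likelihood of a binary outcome."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += y - p          # gradient w.r.t. the intercept
            g1 += (y - p) * x    # gradient w.r.t. the slope
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

# Hypothetical data: a funds-flow ratio (x) and a bankruptcy indicator (y),
# where lower funds flow is associated with failure
xs = [-2.0, -1.5, -1.0, -0.5, 0.5, 1.0, 1.5, 2.0]
ys = [1, 1, 1, 1, 0, 0, 0, 0]
b0, b1 = fit_logistic(xs, ys)
print(b1 < 0)  # a negative slope: worse funds flow raises predicted risk
```

Fitting one such model per funds flow measure and comparing their classification accuracy mirrors the study's design at a toy scale.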
Coupling of Coastal Zone Color Scanner data to a physical-biological model of the southeastern U.S. continental shelf ecosystem by Joji Ishizaka (Book)
1 edition published in 1989 in English and held by 3 WorldCat member libraries worldwide
Chlorophyll distributions on the southeastern U.S. continental shelf obtained from nine Coastal Zone Color Scanner (CZCS) images for April 1980 were used to verify a physical-biological model for this region. The CZCS chlorophyll distributions were also assimilated into the model to improve the simulated chlorophyll distributions and associated phytoplankton flux calculations. Flow and temperature fields for an area 45 km across-shelf and 200 km along-shelf were estimated by applying an optimal interpolation technique to current meter data collected in this area. These fields were combined with a biological model that included nutrient, phytoplankton, zooplankton, and detritus components. Without biological processes, simulated chlorophyll distributions accurately reproduced the CZCS chlorophyll distributions. When biological processes were included, upwelled nutrients were required to maintain the chlorophyll distributions. These results indicate that chlorophyll distributions on the southeastern U.S. continental shelf are controlled mainly by horizontal advection; however, phytoplankton growth associated with shelf-break upwelling is necessary to maintain the observed patterns. Chlorophyll patterns obtained with the model were different from those derived by CZCS only when unrealistic biological parameter values were used. The magnitude of the chlorophyll concentration was affected by the phytoplankton, zooplankton, and regeneration parameter values. The major error sources within the model came from the estimation of the upwelling and the circulation. The integrated phytoplankton carbon and nutrient nitrogen fluxes across the 45 m and 75 m isobaths for the model domain were estimated from the simulated chlorophyll and nutrient distributions. The fluxes showed considerable time variability; however, the integrated fluxes for April 1980 showed onshore nutrient and phytoplankton fluxes at the 45 m isobath and an onshore nutrient flux at the 75 m isobath. The phytoplankton carbon flux at the 75 m isobath was offshore, indicating the possible export of carbon offshore
Stata technical bulletin reprints (Book)
1 edition published in 1998 in English and held by 3 WorldCat member libraries worldwide
Estimation in spatial time series by Gary Richard Stevens (Book)
1 edition published in 1986 in English and held by 3 WorldCat member libraries worldwide
Several different methods of modeling and analyzing spatial data are discussed. The problems associated with estimation of the parameters of the models are noted, and various methods to avoid these problems are presented. The asymptotic properties of the Yule-Walker and least squares estimators for the parameters of unilateral models are stated. The small sample properties of these estimators, along with the small sample properties of various order-determining criteria, are investigated using a simulation study. A method for modeling the data using the spectral density is presented, and asymptotic confidence bounds for the spectral density are derived. Estimators for the peak frequencies of the spectral density and their asymptotic properties are also derived
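Peak frequencies of a spectral density, as studied in this abstract and the autoregressive thesis above, are commonly estimated by locating the maximum of the periodogram. Below is a one-dimensional illustrative Python sketch on a hypothetical cosine series (the thesis itself treats the spatial case):

```python
import math

def periodogram(x, freqs):
    """Periodogram ordinates I(f) = |sum_t x_t e^(-2*pi*i*f*t)|^2 / n."""
    n = len(x)
    out = []
    for f in freqs:
        re = sum(x[t] * math.cos(2 * math.pi * f * t) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * f * t) for t in range(n))
        out.append((re * re + im * im) / n)
    return out

# Hypothetical series with a pure cycle at frequency 0.2
n = 200
x = [math.cos(2 * math.pi * 0.2 * t) for t in range(n)]

# Evaluate at the Fourier frequencies j/n and take the argmax as the peak estimate
freqs = [j / n for j in range(1, n // 2)]
I = periodogram(x, freqs)
peak = freqs[max(range(len(I)), key=I.__getitem__)]
print(peak)
```

With noisy data the raw periodogram is smoothed first, and the asymptotic confidence bounds mentioned in the abstract quantify the uncertainty of the resulting peak-frequency estimate.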
Modern Methods of Multiple Spectral Density Estimation by H. Joseph Newton (Book)
4 editions published between 1979 and 1980 in English and held by 3 WorldCat member libraries worldwide
Two pages to a leaf
An investigation into the causes of non-martingale behavior in commodity futures prices by Scott Wesley Barnhart (Book)
1 edition published in 1984 in English and held by 2 WorldCat member libraries worldwide
The weak form of the efficient markets hypothesis has been tested extensively in financial asset markets. The results of these tests indicate that the hypothesis that stock prices fully reflect available information generally cannot be rejected. When tested in commodity futures markets, however, the hypothesis does not retain its impeccable character. A survey of tests of efficiency in commodity markets reveals significant departures from the standard concepts of weak form market efficiency. The purpose of this research is to explain the causes of non-martingale behavior in commodity futures prices. The commodity futures market is modelled in a stochastic rational expectations structure in which risk-averse firms and speculators maximize the expected utility of profits. Assuming a given stochastic aggregate demand, the aggregate supply and inventory demand functions are derived from the optimization processes of the individual participants in the market. From these aggregate relations the equilibrium spot and futures prices are simultaneously determined. Empirically testable hypotheses concerning non-martingale behavior in futures prices are derived from the model. These hypotheses are compared to the standard tests of weak form market efficiency in commodity futures markets. The standard tests of market efficiency that are used in this study are univariate time series analysis in the time and frequency domains. In addition, the structural relations of the cash market, specified in the model, are estimated for various commodities. These results are used to analyze the relation between the structural parameters and variances of the model and the time series properties of futures prices. Other potential causes of non-martingale behavior in futures prices are also examined. In particular, the question of market thinness is examined through an analysis of the total volume and open interest in these markets. Evidence is presented that significantly links non-martingale behavior in futures prices to the coefficient of variation from the estimated equilibrium solution for the cash price. Furthermore, the average total volume and open interest in the commodity futures markets studied cannot significantly explain the non-martingale behavior. However, the coefficient of variation of total volume and open interest does significantly explain it
An investigation of the effects of variance components on the performance of job shop dispatching policies by James Ralph Wood (Book)
1 edition published in 1990 in English and held by 2 WorldCat member libraries worldwide
The objectives of this research are twofold: first, to investigate the impact of unequal processing time variances on overall shop performance; second, to implement strategies for improving job shop performance by designing priority dispatching policies that include measures of system statistical variability. The approach is a statistical experiment in which processing time variances at specified machines are systematically increased while processing means are held constant. The job shops modeled herein deviate from mainstream research, which traditionally assumes that processing times are known prior to the application of the priority rules. Instead, this study examines a stochastic environment in which operation processing times are random variables from mutually independent probability distributions. It is assumed that the only information available for jobs waiting for service is their operation processing time distributions. Hence, the priority dispatching policies examined herein are based solely on means and variances instead of exact information. A capstone of this research is the development of four new priority dispatching policies that include components of system variability. In simulated cases these four rules are compared against five commonly studied dispatching rules. The results indicate that the four variance-inclusive rules perform as well as or better than the five traditional dispatching rules on every criterion measured.
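The idea of a variance-inclusive priority rule can be sketched in miniature. The example below is a simplified single-machine illustration with hypothetical job data, not the dissertation's multi-machine job shop model, and the `mean + std` priority weighting is an assumed illustrative rule rather than one of the study's four. As in the study, each job's true processing time is a random draw revealed only when the job runs; the dispatcher sees only distribution parameters.

```python
import random
import statistics

def simulate(jobs, key, seed=0):
    """Sequence (mean, std) jobs on one machine by priority `key`
    (lower value runs first) and return the mean flow time.
    Actual processing times are drawn only when each job starts."""
    rng = random.Random(seed)
    order = sorted(jobs, key=key)
    clock, flows = 0.0, []
    for mean, std in order:
        t = max(0.0, rng.gauss(mean, std))  # realized processing time
        clock += t
        flows.append(clock)                 # flow time = completion time here
    return statistics.mean(flows)

rng = random.Random(42)
jobs = [(rng.uniform(1, 10), rng.uniform(0, 5)) for _ in range(200)]

# SPT by expected time vs. an illustrative variance-penalized rule
spt = simulate(jobs, key=lambda j: j[0])
var_aware = simulate(jobs, key=lambda j: j[0] + j[1])

print(f"SPT mean flow time:            {spt:.2f}")
print(f"Variance-aware mean flow time: {var_aware:.2f}")
```

A fuller experiment along the study's lines would route jobs through multiple machines, systematically vary the variances at selected machines, and compare several rules across multiple performance criteria.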
Associated Subjects
Autoregression (Statistics) Bankruptcy Chlorophyll--Measurement Commerce Commodity exchanges Computer graphics Computer interfaces Continental shelf Data editing Duopolies Economics, Mathematical Economics--Statistical methods Electric cables--Fault location Electric fault location--Data processing Fault location (Engineering) Flow of funds International economic relations Marine biology--Remote sensing Mathematical statistics--Computer-assisted instruction Mathematical statistics--Data processing Medical statistics Meta-analysis Numerical analysis--Data processing Phytoplankton Production control Production scheduling Social sciences--Statistical methods Social sciences--Statistical methods--Computer programs Spatial analysis (Statistics) Spectral energy distribution Stata Statistics Statistics--Computer programs Statistics--Data processing Time-series analysis Time-series analysis--Data processing TIMESLAB Uncertainty (Information theory) United States