The OCLC Research WorldCat Identities project is ending. The work provided valuable insight into how to mine bibliographic data for information about the People and Organizations that create, and serve as the subjects of, library materials. The findings demonstrated the power of data mining over the world’s largest set of bibliographic metadata and highlighted the value of collaborative, collective cataloging. The WorldCat Identities data has positively influenced OCLC’s work to build the WorldCat Entities application and the 150 million Person and Work descriptions accessible through it. This work will continue to build our entity ecosystem to support the future knowledge work of librarians.

The WorldCat Identities web application will be retired and shut down in the coming months, and the data is no longer being updated. The most recent version of the data is from July 2022. As OCLC continues to build out the WorldCat Entities ecosystem, please use WorldCat Entities as a source for persistent Person identifiers.

WorldCat Identities

Kreinovich, Vladik

Works: 136 works in 579 publications in 2 languages and 10,889 library holdings
Genres: Handbooks and manuals  Conference papers and proceedings  Academic theses 
Roles: Editor, Author, htt, Other, Collector, Contributor, Publishing director, Compiler
Classifications: QA76.9.S63, 006.3
Publication Timeline
Most widely held works by Vladik Kreinovich
Constraint programming and decision making by Martine Ceberio( )

22 editions published between 2014 and 2018 in English and German and held by 577 WorldCat member libraries worldwide

In many application areas, it is necessary to make effective decisions under constraints. Several area-specific techniques are known for such decision problems; however, because these techniques are area-specific, it is not easy to apply each technique to other application areas. Cross-fertilization between different application areas is one of the main objectives of the annual International Workshops on Constraint Programming and Decision Making. Those workshops, held in the US (El Paso, Texas), in Europe (Lyon, France), and in Asia (Novosibirsk, Russia), from 2008 to 2012, have attracted researchers and practitioners from all over the world. This volume presents extended versions of selected papers from those workshops. These papers deal with all stages of decision making under constraints: (1) formulating the problem of multi-criteria decision making in precise terms; (2) determining when the corresponding decision problem is algorithmically solvable; (3) finding the corresponding algorithms, and making these algorithms as efficient as possible; and (4) taking into account interval, probabilistic, and fuzzy uncertainty inherent in the corresponding decision making problems. The resulting application areas include environmental studies (selecting the best location for a meteorological tower), biology (selecting the most probable evolution history of a species), and engineering (designing the best control for a magnetic levitation train)
Uncertainty analysis in econometrics with applications by Thailand Econometric Society( )

17 editions published between 2012 and 2013 in English and German and held by 411 WorldCat member libraries worldwide

Unlike uncertain dynamical systems in physical sciences, where models for prediction are essentially given to us by physical laws, uncertain dynamical systems in economics need statistical models. In this context, modeling and optimization surface as basic ingredients for fruitful applications. This volume concentrates on the current methodology of copulas and maximum entropy optimization. This volume contains main research presentations at the Sixth International Conference of the Thailand Econometric Society held at the Faculty of Economics, Chiang Mai University, Thailand, during January 10-11, 2013. It consists of keynote addresses and theoretical and applied contributions. These contributions to Econometrics are centered around the theme of Copulas and Maximum Entropy Econometrics. The method of copulas is applied to a variety of economic problems where multivariate model building and correlation analysis are needed. As for the art of choosing copulas in practical problems, the principle of maximum entropy surfaces as a potential way to do so. The state-of-the-art of Maximum Entropy Econometrics is presented in the first keynote address, while the second keynote address focuses on testing stationarity in economic time series data
Modeling dependence in econometrics by Thailand Econometric Society( )

19 editions published between 2013 and 2014 in English and held by 371 WorldCat member libraries worldwide

In economics, many quantities are related to each other. Such economic relations are often much more complex than relations in science and engineering, where some quantities are independent, and the relation between others can be well approximated by linear functions. As a result of this complexity, when we apply traditional statistical techniques -- developed for science and engineering -- to process economic data, the inadequate treatment of dependence leads to misleading models and erroneous predictions. Some economists even blamed such inadequate treatment of dependence for the 2008 financial crisis. To make economic models more adequate, we need more accurate techniques for describing dependence. Such techniques are currently being developed. This book contains descriptions of state-of-the-art techniques for modeling dependence, and economic applications of these techniques. Most of these research developments are centered around the notion of a copula -- a general way of describing dependence in probability theory and statistics. To be even more adequate, many papers go beyond traditional copula techniques and take into account, e.g., the dynamical (changing) character of the dependence in economics
Recent developments and the new direction in soft-computing foundations and applications : selected papers from the 6th World Conference on Soft Computing, May 22-25, 2016, Berkeley, USA by WCSC (Conference : Soft computing)( )

15 editions published between 2018 and 2021 in English and held by 359 WorldCat member libraries worldwide

This book is an authoritative collection of contributions in the field of soft computing. Based on selected works presented at the 6th World Conference on Soft Computing, held on May 22-25, 2016, in Berkeley, USA, it describes new theoretical advances, as well as cutting-edge methods and applications. Theories cover a wealth of topics, such as fuzzy logic, cognitive modeling, Bayesian and probabilistic methods, multi-criteria decision making, utility theory, approximate reasoning, human-centric computing and many others. Applications concern a number of fields, such as the internet and semantic web, social networks and trust, control and robotics, computer vision, medicine and bioinformatics, as well as finance, security and e-commerce, among others. Dedicated to the 50th Anniversary of Fuzzy Logic and to the 95th Birthday Anniversary of Lotfi A. Zadeh, the book not only offers a timely view on the field, but also discusses thought-provoking developments and challenges, thus fostering new research directions in the diverse areas of soft computing
Algorithmic aspects of analysis, prediction, and control in science and engineering : an approach based on symmetry and similarity by Jaime Nava( )

19 editions published between 2014 and 2016 in English and held by 337 WorldCat member libraries worldwide

This book demonstrates how to describe and analyze a system's behavior and extract the desired prediction and control algorithms from this analysis. A typical prediction is based on observing similar situations in the past, knowing the outcomes of these past situations, and expecting that the future outcome of the current situation will be similar to these past observed outcomes. In mathematical terms, similarity corresponds to symmetry, and similarity of outcomes to invariance. This book shows how symmetries can be used in all classes of algorithmic problems of sciences and engineering: from analysis to prediction to control. Applications cover chemistry, geosciences, intelligent control, neural networks, quantum physics, and thermal physics. Specifically, it is shown how the approach based on symmetry and similarity can be used in the analysis of real-life systems, in the algorithms of prediction, and in the algorithms of control
Econometrics of risk by Van-Nam Huynh( )

16 editions published between 2014 and 2015 in English and held by 335 WorldCat member libraries worldwide

This edited book contains several state-of-the-art papers devoted to econometrics of risk. Some papers provide theoretical analysis of the corresponding mathematical, statistical, computational, and economical models. Other papers describe applications of the novel risk-related econometric techniques to real-life economic situations. The book presents new methods developed just recently, in particular, methods using non-Gaussian heavy-tailed distributions, methods using non-Gaussian copulas to properly take into account dependence between different quantities, methods taking into account imprecise ("fuzzy") expert knowledge, and many other innovative techniques. This versatile volume helps practitioners to learn how to apply new techniques of econometrics of risk, and researchers to further improve the existing models and to come up with new ideas on how to best take into account economic risks
Handbook of granular computing by Witold Pedrycz( )

9 editions published in 2008 in English and held by 331 WorldCat member libraries worldwide

Although the term is relatively recent, the notions and principles of Granular Computing (GrC) have appeared in a different guise in many related fields, including granularity in Artificial Intelligence, interval computing, cluster analysis, quotient space theory and many others. Recent years have witnessed a renewed and expanding interest in the topic as it begins to play a key role in bioinformatics, e-commerce, machine learning, security, data mining and wireless mobile computing when it comes to the issues of effectiveness, robustness and uncertainty. The Handbook of Granular Comput
Advance trends in soft computing : proceedings of WCSC 2013, December 16-18, San Antonio, Texas, USA by Mohammad Jamshidi( )

11 editions published in 2014 in English and held by 310 WorldCat member libraries worldwide

This book is the proceedings of the 3rd World Conference on Soft Computing (WCSC), which was held in San Antonio, TX, USA, on December 16-18, 2013. It presents state-of-the-art theory and applications of soft computing together with an in-depth discussion of current and future challenges in the field, providing readers with a 360-degree view on soft computing
Causal inference in econometrics by Van-Nam Huynh( )

15 editions published between 2015 and 2018 in English and German and held by 293 WorldCat member libraries worldwide

"This book is devoted to the analysis of causal inference which is one of the most difficult tasks in data analysis: when two phenomena are observed to be related, it is often difficult to decide whether one of them causally influences the other one, or whether these two phenomena have a common cause. This analysis is the main focus of this volume. To get a good understanding of the causal inference, it is important to have models of economic phenomena which are as accurate as possible. Because of this need, this volume also contains papers that use non-traditional economic models, such as fuzzy models and models obtained by using neural networks and data mining techniques. It also contains papers that apply different econometric models to analyze real-life economic dependencies"--Publisher's description
Propagation of interval and probabilistic uncertainty in cyberinfrastructure-related data processing and data fusion by Christian Servín Meneses( )

12 editions published between 2014 and 2015 in English and held by 284 WorldCat member libraries worldwide

"On various examples ranging from geosciences to environmental sciences, this book explains how to generate an adequate description of uncertainty, how to justify semiheuristic algorithms for processing uncertainty, and how to make these algorithms more computationally efficient. It explains in what sense the existing approach to uncertainty as a combination of random and systematic components is only an approximation, presents a more adequate three-component model with an additional periodic error component, and explains how uncertainty propagation techniques can be extended to this model. The book provides a justification for a practically efficient heuristic technique (based on fuzzy decision-making). It explains how the computational complexity of uncertainty processing can be reduced. The book also shows how to take into account that in real life, the information about uncertainty is often only partially known, and, on several practical examples, explains how to extract the missing information about uncertainty from the available data." -- Back cover
Robustness in econometrics( )

17 editions published between 2017 and 2018 in English and German and held by 282 WorldCat member libraries worldwide

This book presents recent research on robustness in econometrics. Robust data processing techniques - i.e., techniques that yield results minimally affected by outliers - and their applications to real-life economic and financial situations are the main focus of this book. The book also discusses applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. In day-by-day data, we often encounter outliers that do not reflect the long-term economic trends, e.g., unexpected and abrupt fluctuations. As such, it is important to develop robust data processing techniques that can accommodate these fluctuations
Uncertainty modeling : dedicated to Professor Boris Kovalerchuk on his anniversary( )

14 editions published between 2017 and 2018 in English and German and held by 267 WorldCat member libraries worldwide

This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts
Predictive econometrics and big data by Vladik Kreinovich( )

13 editions published between 2017 and 2018 in English and held by 252 WorldCat member libraries worldwide

This book presents recent research on predictive econometrics and big data. Gathering edited papers presented at the 11th International Conference of the Thailand Econometric Society (TES2018), held in Chiang Mai, Thailand, on January 10-12, 2018, its main focus is on predictive techniques - which directly aim at predicting economic phenomena; and big data techniques - which enable us to handle the enormous amounts of data generated by modern computers in a reasonable time. The book also discusses the applications of more traditional statistical techniques to econometric problems. Econometrics is a branch of economics that employs mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. It is therefore important to develop data processing techniques that explicitly focus on prediction. The more data we have, the better our predictions will be. As such, these techniques are essential to our ability to process huge amounts of available data
Econometrics for financial applications by Ly Hoang ANH( )

13 editions published between 2017 and 2018 in English and held by 244 WorldCat member libraries worldwide

This book addresses both theoretical developments in and practical applications of econometric techniques to finance-related problems. It includes selected edited outcomes of the International Econometric Conference of Vietnam (ECONVN2018), held at Banking University, Ho Chi Minh City, Vietnam on January 15-16, 2018. Econometrics is a branch of economics that uses mathematical (especially statistical) methods to analyze economic systems, to forecast economic and financial dynamics, and to develop strategies for achieving desirable economic performance. An extremely important part of economics is finance: a financial crisis can bring the whole economy to a standstill and, vice versa, a smart financial policy can dramatically boost economic development. It is therefore crucial to be able to apply mathematical techniques of econometrics to financial problems. Such applications are a growing field, with many interesting results, and an even larger number of challenges and open problems
Towards analytical techniques for optimizing knowledge acquisition, processing, propagation, ... and use in cyberinfrastructure and big data by Leonardo Octavio Lerma( )

12 editions published between 2017 and 2018 in English and German and held by 242 WorldCat member libraries worldwide

This book describes analytical techniques for optimizing knowledge acquisition, processing, and propagation, especially in the contexts of cyber-infrastructure and big data. Further, it presents easy-to-use analytical models of knowledge-related processes and their applications. The need for such methods stems from the fact that, when we have to decide where to place sensors, or which algorithm to use for processing the data--we mostly rely on experts' opinions. As a result, the selected knowledge-related methods are often far from ideal. To make better selections, it is necessary to first create easy-to-use models of knowledge-related processes. This is especially important for big data, where traditional numerical methods are unsuitable. The book offers a valuable guide for everyone interested in big data applications: students looking for an overview of related analytical techniques, practitioners interested in applying optimization techniques, and researchers seeking to improve and expand on these techniques
Combining interval and probabilistic, and other types of uncertainty in engineering applications by Andrew M Pownuk( )

11 editions published in 2018 in English and held by 237 WorldCat member libraries worldwide

How can we solve engineering problems while taking into account data characterized by different types of measurement and estimation uncertainty : interval, probabilistic, fuzzy, etc.? This book provides a theoretical basis for arriving at such solutions, as well as case studies demonstrating how these theoretical ideas can be translated into practical applications in the geosciences, pavement engineering, etc. In all these developments, the authors' objectives were to provide accurate estimates of the resulting uncertainty; to offer solutions that require reasonably short computation times; to offer content that is accessible for engineers; and to be sufficiently general - so that readers can use the book for many different problems. The authors also describe how to make decisions under different types of uncertainty. The book offers a valuable resource for all practical engineers interested in better ways of gauging uncertainty, for students eager to learn and apply the new techniques, and for researchers interested in processing heterogeneous uncertainty
Bounded rationality in decision making under uncertainty : towards optimal granularity by Joseph A Lorkowski( )

4 editions published in 2018 in English and held by 234 WorldCat member libraries worldwide

This book addresses an intriguing question: are our decisions rational? It explains seemingly irrational human decision-making behavior by taking into account our limited ability to process information. It also shows with several examples that optimization under granularity restriction leads to observed human decision-making. Drawing on the Nobel-prize-winning studies by Kahneman and Tversky, researchers have found many examples of seemingly irrational decisions: e.g., we overestimate the probability of rare events. Our explanation is that since human abilities to process information are limited, we operate not with the exact values of relevant quantities, but with "granules" that contain these values. We show that optimization under such granularity indeed leads to observed human behavior. In particular, for the first time, we explain the mysterious empirical dependence of betting odds on actual probabilities. This book can be recommended to all students interested in human decision-making, to researchers whose work involves human decisions, and to practitioners who design and employ systems involving human decision-making, so that they can better utilize our ability to make decisions under uncertainty
Soft computing in measurement and information acquisition by Leonid Reznik( )

10 editions published in 2003 in English and held by 234 WorldCat member libraries worldwide

The vigorous development of the internet and other information technologies have significantly expanded the amount and variety of sources of information available on decision making. This book presents the current trends of soft computing applications to the fields of measurements and information acquisition. Main topics are the production and presentation of information including multimedia, virtual environment, and computer animation as well as the improvement of decisions made on the basis of this information in various applications ranging from engineering to business. In order to make high-quality decisions, one has to fuse information of different kinds from a variety of sources with differing degrees of reliability and uncertainty. The necessity to use intelligent methodologies in the analysis of such systems is demonstrated as well as the inspiring relation of computational intelligence to its natural counterpart. This book includes several contributions demonstrating a further movement towards the interdisciplinary collaboration of the biological and computer sciences with examples from biology and robotics
Computational complexity and feasibility of data processing and interval computations by Vladik Kreinovich( )

14 editions published between 1997 and 2011 in English and held by 204 WorldCat member libraries worldwide

Targeted audience: Specialists in numerical computations, especially in numerical optimization, who are interested in designing algorithms with automatic result verification, and who would therefore be interested in knowing how general their algorithms can in principle be. Mathematicians and computer scientists who are interested in the theory of computing and computational complexity, especially computational complexity of numerical computations. Students in applied mathematics and computer science who are interested in computational complexity of different numerical methods and in learning general techniques for estimating this computational complexity. The book is written with all explanations and definitions added, so that it can be used as a graduate-level textbook. What this book is about: Data processing. In many real-life situations, we are interested in the value of a physical quantity y that is difficult (or even impossible) to measure directly. For example, it is impossible to directly measure the amount of oil in an oil field or a distance to a star. Since we cannot measure such quantities directly, we measure them indirectly, by measuring some other quantities xi and using the known relation between y and the xi's to reconstruct y. The algorithm that transforms the results of measuring the xi into an estimate for y is called data processing
Problems of reducing the exhaustive search( Book )

8 editions published between 1996 and 2012 in English and held by 198 WorldCat member libraries worldwide

This collection contains translations of papers on propositional satisfiability and related logical problems which appeared in Problemy Sokrashcheniya Perebora, published in Russian in 1987 by the Scientific Council "Cybernetics" of the USSR Academy of Sciences. The problems form the nucleus of this intensively developing area. This translation is dedicated to the memory of two remarkable Russian mathematicians, Sergei Maslov and his wife, Nina Maslova. Maslov is known as the originator of the inverse method in automated deduction, which was discovered at the same time as the resolution method
Audience Level
Audience level: 0.00 (from 0.00 for Constraint ... to 0.00 for Constraint ...)

Alternative Names
Kreinovič, Vladik

Kreinovič, Vladik 1952-

Kreinovich, V.

Kreinovich, V., 1952-

Kreinovich, V. (Vladik)

Kreinovich, Vladik.

Kreinovich, Vladik Y.

Krejnovič, Vladik

Krejnovič, Vladik, 1952-

Vladik Kreinovich Amerikaans informaticus

Vladik Kreinovich informaticien américain

Vladik Kreinovich informáticu teóricu estauxunidense

Vladik Kreinovitx matemàtic estatunidenc

Крейнович, Владик

Крейнович Владислав Я.

Крейнович, Владислав Яковлевич

English (265)

German (6)