Zenil, Hector
Overview
Works:  9 works in 62 publications in 2 languages and 3,098 library holdings 

Genres:  Conference papers and proceedings 
Roles:  Editor, Author 
Classifications:  QA274, 003.54 
Most widely held works by Hector Zenil
Randomness through computation : some answers, more questions (Book)
16 editions published in 2011 in English and held by 150 WorldCat member libraries worldwide
The scope of Randomness Through Computation is novel. Contributors share their personal views and anecdotes on the various reasons and motivations which led them to the study of randomness. Using a question-and-answer format, they share their visions from their several distinctive vantage points
A computable universe : understanding and exploring nature as computation by Hector Zenil (Book)
19 editions published between 2012 and 2013 in English and Undetermined and held by 99 WorldCat member libraries worldwide
This volume, with a foreword by Sir Roger Penrose, discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? How does nature compute? The contributors are world-renowned experts who have helped shape a cutting-edge computational understanding of the universe. They discuss computation in the world from a variety of perspectives, ranging from foundational concepts to pragmatic models to ontological conceptions and philosophical implications. The volume provides a state-of-the-art collection of technical papers and non-technical essays …
Irreducibility and computational equivalence : 10 years after Wolfram's A new kind of science by Hector Zenil (Book)
10 editions published between 2013 and 2015 in English and held by 39 WorldCat member libraries worldwide
It is clear that computation is playing an increasingly prominent role in the development of mathematics, as well as in the natural and social sciences. The work of Stephen Wolfram over the last several decades has been a salient part of this phenomenon, helping to found the field of Complex Systems, with many of his constructs and ideas incorporated in his book A New Kind of Science (ANKS) becoming part of the scientific discourse and general academic knowledge: from the now-established Elementary Cellular Automata to the unconventional concept of mining the Computational Universe, and from Wolfram's now-widespread Behavioural Classification to his principles of Irreducibility and Computational Equivalence. This volume, with a Foreword by Gregory Chaitin and an Afterword by Cris Calude, covers these and other topics related to or motivated by Wolfram's seminal ideas, reporting on research undertaken in the decade following the publication of Wolfram's NKS book. Featuring 39 authors, its 23 contributions are organized into seven parts: Mechanisms in Programs & Nature; Systems Based on Numbers & Simple Programs; Social and Biological Systems & Technology; Fundamental Physics; The Behavior of Systems & the Notion of Computation; Irreducibility & Computational Equivalence; Reflections and Philosophical Implications. "I found this volume fascinating in its efforts to flesh out the computational implications for biology more generally." (Dr. Mark Changizi) "I believe that this book will be an inspiration for future work in interdisciplinary research at the intersection of computer science, natural and social sciences." (Prof. Ivan Zelinka)
How nature works : complexity in interdisciplinary research and applications by Ivan Zelinka (Book)
10 editions published in 2014 in English and held by 29 WorldCat member libraries worldwide
This book is based on the outcome of the "2012 Interdisciplinary Symposium on Complex Systems" held on the island of Kos. It consists of 12 selected papers from the symposium, starting with a comprehensive overview and classification of complexity problems, and continuing with chapters on complexity, its observation and modeling, and its applications to solving various problems, including real-life applications. More specifically, readers will encounter the structural complexity of vortex flows, the use of chaotic dynamics within evolutionary algorithms, and complexity in synthetic biology.
Randomness through computation : some answers, more questions
1 edition published in 2011 in English and held by 6 WorldCat member libraries worldwide
This review volume consists of a set of chapters written by leading scholars, most of them founders of their fields. It explores the connections of Randomness to other areas of scientific knowledge, especially its fruitful relationship to Computability and Complexity Theory, and also to areas such as Probability, Statistics, Information Theory, Biology, Physics, Quantum Mechanics, Learning Theory and Artificial Intelligence. The contributors cover these topics without neglecting important philosophical dimensions, sometimes going beyond the purely technical to formulate age-old questions relating to matters such as determinism and free will. The scope of Randomness Through Computation is novel. Each contributor shares their personal views and anecdotes on the various reasons and motivations which led them to the study of randomness. Using a question-and-answer format, they share their visions from their several distinctive vantage points
William Shakespeare by Hector Zenil (Book)
2 editions published in 2005 in Spanish and held by 5 WorldCat member libraries worldwide
Irreducibility and computational equivalence : 10 years after Wolfram's A new kind of science / Hector Zenil (ed.) by Hector Zenil (Book)
2 editions published between 2013 and 2015 in English and held by 4 WorldCat member libraries worldwide
Une approche expérimentale à la théorie algorithmique de la complexité [An experimental approach to algorithmic complexity theory] by Hector Zenil (Book)
1 edition published in 2011 in English and held by 1 WorldCat member library worldwide
A limiting feature of Kolmogorov-Chaitin complexity (denoted in this chapter by K) is that it is not computable, owing to the halting problem, which restricts its domain of application. Another criticism concerns K's dependence on a particular language or particular universal Turing machine, especially for fairly short strings, e.g. strings shorter than the typical length of programming-language compilers. In practice, one can approximate K(s) by compression methods, but the performance of these compression methods collapses for short strings: for short strings, approximating K(s) by compression does not work. This thesis presents an empirical approach to overcome the problem. We propose a "natural" method that allows a more stable definition of Kolmogorov-Chaitin complexity K(s) via the algorithmic probability measure m(s). The idea is to run a universal machine on randomly chosen programs in order to compute the probability m(s) (the probability of producing s) experimentally, and then to evaluate K(s) numerically, as an alternative to compression-based methods. The method consists in: (a) running computing machines (Turing machines, cellular automata) systematically to produce strings, (b) observing the resulting probability distributions, and then (c) obtaining K(s) from m(s) by means of the Levin-Chaitin coding theorem
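Step (c) above can be sketched in a few lines: once an empirical output frequency distribution is in hand, the coding theorem suggests the estimate K(s) ≈ -log2 m(s). A minimal sketch, where the run counts are hypothetical toy data rather than results from the thesis:

```python
import math
from collections import Counter

def coding_theorem_estimate(outputs):
    """Estimate K(s) ~ -log2 m(s), approximating the algorithmic
    probability m(s) by the observed frequency of each output string."""
    counts = Counter(outputs)
    total = sum(counts.values())
    return {s: -math.log2(n / total) for s, n in counts.items()}

# Hypothetical toy sample: strings "produced" by 1000 machine runs.
runs = ["0"] * 500 + ["1"] * 300 + ["01"] * 150 + ["0110"] * 50
K = coding_theorem_estimate(runs)
# Frequently produced (hence "simple") strings get lower estimates:
# K["0"] = 1.0 bit, while K["0110"] = -log2(0.05) ≈ 4.32 bits.
```

The point of the estimate is the ordering it induces: strings produced by many programs are assigned low complexity, which compression-based approximations cannot resolve at these lengths.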
Une approche expérimentale à la théorie algorithmique de la complexité by Hector Zenil
1 edition published in 2011 in English and held by 1 WorldCat member library worldwide
In practice, it is a known problem that one cannot compress short strings, shorter, for example, than the length in bits of the compression program, which is added to the compressed version of s, making the result (the program producing s) sensitive to the choice of compressor and the parameters involved. However, short strings are quite often the kind of data encountered in many practical settings. While compressors' asymptotic behavior guarantees eventual convergence to K(s) thanks to the invariance theorem, measurements differ considerably in the domain of short strings. We describe a method that combines several theoretical and experimental results to numerically approximate the algorithmic (Kolmogorov-Chaitin) complexity of short bit strings. This is done by an exhaustive execution of abstract machines for which the halting times are known thanks to the Busy Beaver problem. An output frequency distribution is then computed, from which the algorithmic probability is calculated and the algorithmic complexity evaluated by way of the (Levin-Chaitin) coding theorem. The approach we adopt here is different: it is independent of the machine size (small machines are used only because they allow us to calculate all of them up to a small size) and relies only on the concept of algorithmic probability. The result is a novel approach that we put forward for numerically calculating the complexity of short strings, as an alternative to the indirect method using compression algorithms
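The exhaustive-execution procedure can be sketched for the smallest interesting case, 2-state 2-symbol Turing machines. This is an illustrative reconstruction, not the thesis's actual code: it substitutes a conservative step cutoff for the exact Busy Beaver halting times (the 2-state Busy Beaver halts within 6 steps, so 20 is ample), and the transition-table encoding and output convention are simplifying assumptions.

```python
import math
from collections import Counter
from itertools import product

def run_tm(table, max_steps):
    """Run a 2-symbol Turing machine from a blank (all-0) tape.
    table maps (state, read_symbol) -> (write, move, next_state);
    next_state == -1 means halt after writing and moving.
    Returns the written tape segment as a string, or None on timeout."""
    tape, head, state = {}, 0, 0
    lo = hi = 0
    for _ in range(max_steps):
        write, move, state = table[(state, tape.get(head, 0))]
        tape[head] = write
        lo, hi = min(lo, head), max(hi, head)
        head += move
        if state == -1:
            return "".join(str(tape.get(i, 0)) for i in range(lo, hi + 1))
    return None  # did not halt within the cutoff

def output_distribution(n_states=2, max_steps=20):
    """Exhaustively run every n-state, 2-symbol machine and count the
    output strings of those that halt."""
    options = [(w, m, s) for w in (0, 1) for m in (-1, 1)
               for s in list(range(n_states)) + [-1]]
    keys = list(product(range(n_states), (0, 1)))
    counts = Counter()
    for rules in product(options, repeat=len(keys)):
        out = run_tm(dict(zip(keys, rules)), max_steps)
        if out is not None:
            counts[out] += 1
    return counts

counts = output_distribution()  # 12**4 = 20736 machines in this space
total = sum(counts.values())
m = {s: n / total for s, n in counts.items()}       # output frequencies
K = {s: -math.log2(p) for s, p in m.items()}        # coding-theorem estimate
```

In this tiny machine space the single-symbol outputs dominate the distribution, so "0" and "1" receive the lowest complexity estimates, matching the intuition that the most frequently produced strings are the simplest.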
Related Identities
 Penrose, Roger Author of introduction
 Zelinka, Ivan Author, Editor
 Sanayei, Ali Editor
 Rössler, Otto E. Editor
 SpringerLink (Online service)
 World Scientific (Firm)
 École doctorale Sciences pour l'Ingénieur (Lille)
 Delahaye, Jean-Paul (1952-....) Thesis advisor
 Université Lille 1 - Sciences et technologies Degree grantor
 Cortés, Enrique