WorldCat Identities

Couchot, Jean-François (1973-)

Works: 15 works in 23 publications in 2 languages and 28 library holdings
Roles: Thesis advisor, Opponent, Other, Author
Publication Timeline
Most widely held works by Jean-François Couchot
Hardware implementation of a pseudo random number generator based on chaotic iteration by Mohammed Bakiri( Book )

2 editions published in 2018 in English and held by 2 WorldCat member libraries worldwide

Security and cryptography are key elements in constrained devices such as IoT nodes, smart cards, and embedded systems. Their hardware implementations face challenges in terms of limited physical resources, operating speed, memory capacity, etc. In this context, most protocols rely on the security of a good random number generator, which is considered an indispensable element of a lightweight security core. This work therefore proposes new pseudo-random number generators based on chaotic iterations, designed to be deployed on hardware, namely FPGA or ASIC. These hardware implementations can be described as post-processing applied to existing generators: they transform a non-uniform sequence of numbers into a uniform one. The dependency between input and output has been proven chaotic, notably according to the mathematical definitions of chaos given by Devaney and Li-Yorke. We first develop a complete state of the art of hardware and physical implementations of pseudo-random number generators (PRNGs). We then propose new generators based on chaotic iterations (CIs), which are tested on our hardware platform. The initial idea was to start from the n-cube (or, equivalently, the vectorial negation in CIs), then remove a sufficiently balanced Hamiltonian cycle to produce new functions to be iterated, to which a permutation is added on the output. The methods recommended for finding good functions are detailed, and the whole is implemented on our FPGA platform. The resulting generators generally have better statistical profiles than their inputs while operating at high speed. Finally, we implement them on several hardware supports (a 65-nm ASIC circuit and a Zynq FPGA platform)
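The post-processing idea sketched in the abstract (iterating the vectorial negation on selected coordinates of an n-cube state, driven by an existing generator) can be illustrated with a minimal toy in Python. The state width, the inner generator, and the function names below are illustrative assumptions, not the thesis's hardware design:

```python
import random

N = 8  # state width in bits (an illustrative choice, not the thesis parameter)

def chaotic_iteration_step(state, mask):
    """One chaotic iteration with the vectorial negation: flip exactly the
    bits of `state` selected by `mask`, leave the others unchanged."""
    return state ^ mask

def ci_post_process(inner, state, steps):
    """Post-process an existing generator `inner` (a 0-arg callable returning
    N-bit integers): its outputs choose which coordinates to negate at each
    step, and the successive states form the output stream."""
    out = []
    for _ in range(steps):
        state = chaotic_iteration_step(state, inner() & ((1 << N) - 1))
        out.append(state)
    return out

rng = random.Random(2018)
stream = ci_post_process(lambda: rng.getrandbits(N), state=0b10110001, steps=5)
```

In the actual designs, the set of reachable flips is further constrained (removing a balanced Hamiltonian cycle from the n-cube) and a permutation is applied to the output, which this toy omits.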
Normalized blind STDM watermarking scheme for images and PDF documents robust against fixed gain attack by Makram W Hatoum( )

1 edition published in 2019 in English and held by 2 WorldCat member libraries worldwide

Digital watermarking for PDF documents and images : security, robustness and AI-based attack by Makram Hatoum( Book )

2 editions published in 2020 in English and held by 2 WorldCat member libraries worldwide

Technological development has its pros and cons. Nowadays, we can easily share, download, and upload digital content over the Internet, but malicious users can also illegally alter, duplicate, and distribute any kind of information, such as images and documents. Such content therefore needs to be protected and perpetrators identified. The goal of this thesis is to protect PDF documents and images using Spread Transform Dither Modulation (STDM) as a digital watermarking technique, while taking into consideration the main requirements of transparency, robustness, and security. The STDM watermarking scheme achieves a good level of transparency and robustness against noise attacks. The key to this scheme is the projection vector that spreads the embedded message over a set of cover elements. However, such a key vector can be estimated by unauthorized users using Blind Source Separation (BSS) techniques. In our first contribution, we present the proposed CAR-STDM (Component Analysis Resistant-STDM) watermarking scheme, which guarantees security while preserving transparency and robustness against noise attacks. STDM is also affected by the Fixed Gain Attack (FGA). In the second contribution, we present the proposed N-STDM watermarking scheme, which resists the FGA and enhances robustness against the Additive White Gaussian Noise (AWGN) attack, JPEG compression, and a variety of filtering and geometric attacks. Experiments were conducted separately on PDF documents and on images, in both the spatial and frequency domains. Recently, deep learning and neural networks have achieved noticeable progress, especially in image processing, segmentation, and classification. Models such as Convolutional Neural Networks (CNNs) are exploited to learn image priors for denoising; their good denoising performance can be harmful to watermarked images.
In the third contribution, we present the effect of a Fully Convolutional Neural Network (FCNN), used as a denoising attack, on watermarked images. STDM and Spread Spectrum (SS) are used as watermarking schemes to embed the watermarks in the images under several scenarios. This evaluation shows that such a denoising attack preserves image quality while breaking the robustness of all evaluated watermarking schemes
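The basic STDM mechanism the thesis builds on (quantizing the projection of the cover onto a spreading vector with a bit-dependent dither) can be sketched for a single bit as follows. The step size `delta` and the function names are illustrative assumptions; this is the textbook scheme, not the CAR-STDM or N-STDM variants themselves:

```python
import math
import random

def _dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def stdm_embed(x, p, bit, delta):
    """Embed one bit with STDM: quantize the projection of cover vector x
    onto spreading vector p using a bit-dependent dither, then shift x
    along p so its projection lands exactly on the chosen lattice."""
    norm = math.sqrt(_dot(p, p))
    u = [pi / norm for pi in p]
    proj = _dot(x, u)
    dither = delta / 4 if bit else -delta / 4
    q = delta * round((proj - dither) / delta) + dither
    return [xi + (q - proj) * ui for xi, ui in zip(x, u)]

def stdm_detect(y, p, delta):
    """Decode: pick the dither lattice whose quantizer leaves the smaller residual."""
    norm = math.sqrt(_dot(p, p))
    proj = _dot(y, [pi / norm for pi in p])
    def resid(d):
        return abs(proj - (delta * round((proj - d) / delta) + d))
    return int(resid(delta / 4) < resid(-delta / 4))
```

The security issue discussed above follows directly from this structure: an attacker who estimates `p` (e.g., via BSS) can erase or forge the watermark, and a fixed-gain attack that scales `y` shifts the projection off the lattice, which motivates the normalized N-STDM variant.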
Publishing set-valued dataset : strengthening the Disassociation approach to improve both privacy preservation and utility by Nancy Awad( Book )

2 editions published in 2020 in English and held by 2 WorldCat member libraries worldwide

This thesis addresses the problem of anonymizing set-valued datasets, also known as transactional data. The work is based on an anonymization technique specific to set-valued data defined by Terrovitis as "disassociation". This technique works under the assumption that data values should not be altered (contrary to differential privacy) or suppressed (unlike k-anonymity). The dual character of disassociation is investigated: first, how disassociation affects data utility and knowledge extraction is evaluated and improved; second, how faithfully disassociation protects individuals' private lives under its own privacy model is studied and adjusted. As a first observation, the utility of the information in a disassociated dataset is investigated. Through probabilistic analysis, it is proven that various associations in a disassociated dataset suffer from information loss. Therefore, to increase the utility of a predefined set of associations, specified as "utility rules" by the user, the clustering step of disassociation is optimized using ant-based clustering driven by the utility rules in question. Disassociation also suffers from a privacy breach under homogeneity attacks, identified in 2016 as the "cover problem". To address this problem, a solution using partial suppression and noise addition is proposed. The correctness of the solution is investigated and proven: every cover problem is resolved, and no new cover problem is generated by the proposed solution. Finally, as disassociated data is not a common data form, it is hard for machine learning algorithms and data analysts to extract information from it and exploit it as-is. Re-expressing data anonymized by disassociation in its original form is a theoretical solution that can bring data analysis techniques closer to anonymized data.
A probabilistic re-association algorithm is thus proposed, sensitive to the probabilistic distribution of the associations within a cluster. This solution relies on an elaborated definition of neighboring datasets to prove its sensitivity and its respect of the privacy constraints. The fidelity of the solution to data utility preservation is evaluated using the most common analysis techniques for set-valued data: mining frequent itemsets and association rules. In conclusion, this work digs deep into the anonymization of set-valued datasets: starting from the disassociation technique, the "cover problem" privacy breach is addressed and data utility is investigated both within the disassociated dataset and for future uses. The results are strong in terms of both data utility and privacy preservation
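To make the re-association idea concrete, here is a deliberately naive toy: a disassociated cluster stores one partial record per original record in each chunk, and re-association rebuilds candidate full records by matching partial records across chunks. The uniform random matching below is a hypothetical stand-in for the distribution-sensitive algorithm of the thesis:

```python
import random

def reassociate(chunks, rng):
    """Toy re-association for one disassociated cluster: `chunks` is a list
    of chunks, each holding one partial record (a tuple of items) per
    original record. Rebuild candidate records by randomly matching the
    partial records of each chunk (naive uniform matching, not the
    probabilistic, distribution-sensitive scheme described above)."""
    n = len(chunks[0])
    rebuilt = [set() for _ in range(n)]
    for chunk in chunks:
        parts = list(chunk)
        rng.shuffle(parts)                    # random within-chunk assignment
        for record, part in zip(rebuilt, parts):
            record.update(part)
    return rebuilt

cluster = [[("a",), ("b",)], [("x", "y"), ("z",)]]  # two chunks, two records
records = reassociate(cluster, random.Random(0))
```

Any such matching preserves per-chunk item frequencies exactly, which is why frequent-itemset and association-rule mining are natural yardsticks for how much cross-chunk utility the re-association recovers.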
Vérification d'invariants de systèmes paramétrés par superposition by Jean-François Couchot( Book )

2 editions published in 2006 in French and held by 2 WorldCat member libraries worldwide

Safe disassociation of set-valued datasets by Nancy Awad( )

1 edition published in 2019 in English and held by 2 WorldCat member libraries worldwide

Security analysis of steganalyzers by Yousra Ahmed Fadil( )

2 editions published in 2017 in English and held by 2 WorldCat member libraries worldwide

In recent times, the fields of image steganography and steganalysis have become more important due to developments in the Internet domain. It is important to keep in mind that, like any other application, the whole process of steganography and steganalysis can be used for legal or illegal operations. The work in this thesis is divided into three parts. The first concentrates on parameters that increase the security of steganography methods against steganalysis techniques. In this contribution, the effects of the payload, the extracted features, and the groups of images used in the learning and testing stages of the steganalysis system are studied. Simulations show that state-of-the-art steganalyzers fail to detect the presence of a secret message when some of these parameters are changed. The second part studies how the presence of many steganography methods may influence the detection of a secret message, taking into consideration that there is no ideal situation for embedding a secret message when the steganographer can use any scheme with any payload. In the third part, we propose a method to compute an accurate distortion map based on the second-order derivative of the image. The second-order derivative is used to compute the level curves and to embed the message in pixels lying outside clean level curves. The results of embedding a secret message with our method are acceptable with respect to state-of-the-art steganography
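The third part's costing idea can be sketched with a discrete second derivative: a 4-neighbour Laplacian as the second-order operator and an inverse-magnitude cost are illustrative assumptions here, not the thesis's exact distortion function:

```python
def laplacian(img):
    """Discrete second derivative (4-neighbour Laplacian) of a grayscale
    image given as a list of lists; border pixels are left at 0."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i][j] = (img[i-1][j] + img[i+1][j] + img[i][j-1] + img[i][j+1]
                         - 4 * img[i][j])
    return out

def distortion_map(img, eps=1e-3):
    """Toy cost map in the spirit described above: cheap to embed where the
    second derivative is large (textured areas, away from clean level
    curves), expensive where it is near zero (smooth regions whose level
    curves would be visibly disturbed)."""
    lap = laplacian(img)
    return [[1.0 / (abs(v) + eps) for v in row] for row in lap]
```

A smooth region thus receives a very high embedding cost, steering the payload toward textured pixels outside clean level curves.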
Ancestral Reconstruction and Investigations of Genomics Recombination on Chloroplasts Genomes by Bashar Al-Nuaimi( )

2 editions published in 2017 in English and held by 2 WorldCat member libraries worldwide

Modern biology is grounded in the theory of evolution: every new species emerges from an existing species, so different species share common ancestry, as represented in the phylogenetic classification. Common ancestry may explain the similarities between all living organisms, such as their general chemistry, cell structure, DNA as genetic material, and the genetic code. Individuals of one species share the same genes but (usually) different allele sequences of these genes; an individual inherits alleles from its ancestors or parents. The goal of phylogenetic studies is to analyze the changes that occur in different organisms during evolution, by identifying the relationships between genomic sequences and determining the ancestral sequences and their descendants. A phylogenetic study can also estimate the time of divergence between groups of organisms that share a common ancestor. Phylogenetic trees are useful in fields of biology such as bioinformatics, and in systematic and comparative phylogenetics. An evolutionary tree, or phylogenetic tree, is a branching diagram of the evolutionary relationships between various biological organisms or other entities, based on the differences and similarities in their genetic characteristics. Phylogenetic trees are built from molecular data such as DNA sequences and protein sequences. In a phylogenetic tree, the nodes represent genomic sequences and are called taxonomic units; each branch connects two adjacent nodes. Similar sequences are neighbors on the outer branches, and a common internal branch links them to a common ancestor; internal branches are called hypothetical taxonomic units. Thus, taxonomic units gathered in the tree are inferred to descend from a common ancestor.
The research conducted in this dissertation focuses on developing appropriate evolutionary models and robust algorithms to solve phylogenetic inference problems and to reconstruct ancestral information, namely gene order and DNA data, over the evolution of complete genomes, as well as their applications
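For a concrete flavour of ancestral-state inference, here is one step of the classical Fitch small-parsimony algorithm on two aligned child sequences. This is a textbook method shown for illustration, not the specific reconstruction algorithm of this dissertation:

```python
def fitch_ancestor(left, right):
    """One Fitch small-parsimony step: per site, the ancestral candidate
    set is the intersection of the children's state sets when non-empty
    (no mutation needed on either branch), otherwise their union
    (one mutation is charged)."""
    ancestor = []
    for a, b in zip(left, right):
        common = a & b
        ancestor.append(common if common else a | b)
    return ancestor

# two aligned leaf sequences represented as per-site state sets
leaf1 = [{c} for c in "ACGT"]
leaf2 = [{c} for c in "ACGA"]
root_candidates = fitch_ancestor(leaf1, leaf2)  # last site is ambiguous: {"T", "A"}
```

Applied bottom-up along a phylogenetic tree, such steps assign candidate states to every hypothetical taxonomic unit, after which a top-down pass resolves the ambiguities.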
On the reconstruction of the ancestral bacterial genomes in genus Mycobacterium and Brucella by Christophe Guyeux( )

1 edition published in 2018 in English and held by 2 WorldCat member libraries worldwide

Blind digital watermarking in PDF documents using Spread Transform Dither Modulation by Ahmad W Bitar( )

1 edition published in 2015 in English and held by 2 WorldCat member libraries worldwide

Combining approaches for predicting genomic evolution by Bassam Alkindy( Book )

2 editions published in 2015 in English and held by 2 WorldCat member libraries worldwide

In bioinformatics, understanding how DNA molecules have evolved over time remains an open and complex problem. Algorithms have been proposed to solve it, but they are limited either to the evolution of a given character (for example, a specific nucleotide) or, conversely, focus on large nuclear genomes (several billion base pairs) that have undergone multiple recombination events; since the problem is NP-complete when the set of all possible operations on these sequences is considered, no general solution exists at present. In this thesis, we tackle the problem of reconstructing ancestral DNA sequences by focusing on nucleotide chains of intermediate size that have experienced relatively little recombination over time: chloroplast genomes. We show that at this scale the ancestor reconstruction problem can be resolved, even when considering the set of all complete chloroplast genomes currently available. We focus specifically on ancestral gene order and content, as well as the technical problems this reconstruction raises in the case of chloroplasts. We show how to obtain predictions of coding sequences of sufficient quality to allow such reconstruction, and how to obtain a phylogenetic tree in agreement with the largest number of genes, on which we can then base our journey back in time. These methods, combining already available tools (whose quality has been assessed) with high-performance computing, artificial intelligence, and biostatistics, were applied to a collection of more than 450 chloroplast genomes
Comparison of metaheuristics to measure gene effects on phylogenetic supports and topologies by Régis Garnier( )

1 edition published in 2018 in English and held by 2 WorldCat member libraries worldwide

Collaborative multimedia sensors for a connected and smart city by Nesrine Khernane( Book )

2 editions published in 2018 in English and held by 2 WorldCat member libraries worldwide

Because of their strong application potential in various innovative domains (remote surveillance, telemedicine, etc.), wireless multimedia sensor networks (WMSNs) have attracted considerable research interest. Besides the constraints raised by scalar sensor networks, WMSNs impose new constraints related to the very nature of the data captured and manipulated: multimedia data are by far much larger than scalar data, and their rich semantic content depends on the quality of acquisition. In this thesis, we are interested in the practical problem of a multimedia sensor network that informs drivers in real time about the parking spaces available in a city, or even a metropolitan area; however, in general, the approaches proposed in our work apply to any surveillance WMSN. In this context, the main objective remains to maximize the lifetime of the network while ensuring an acceptable perceived quality at the destination, under distributed control (for obvious scalability reasons). Two axes must be considered: data processing at the source, and routing. In the data-processing axis, the main problem lies in the "quality" of the data to be transmitted: in general, the higher the quality, the larger the data and consequently the higher the energy consumption, and vice versa. The goal is therefore to find a balance that preserves energy resources, i.e., maximizes the network lifetime while ensuring an acceptable quality of the transmitted data.
This quality is the result of an encoding process at the source. We therefore first addressed the data-processing axis and proposed a completely distributed algorithm that maximizes the network lifetime while optimally balancing the encoding power at the source against the video quality required at the destination. Unlike existing approaches, our algorithm, being distributed, is guaranteed to find such a trade-off regardless of the initial network configuration. Because of the complexity of this problem, especially in a decentralized context, previous works treated data processing independently of routing; in other words, routing was considered an input. In this thesis, we then showed that routing directly impacts the process of extending the network lifetime: we analyzed the behavior of several routing protocols in WMSNs, and the results highlighted this influence. We therefore proposed an analytical model that integrates both data encoding at the sources and routing to the processing station, and we developed a semi-distributed resolution of this problem, with very encouraging results. In a second stage, a fully distributed solution was proposed, in which the routing axis cannot operate without the parameters determined and updated by the data-processing axis, and conversely.
The proposed solution allows (a) end-to-end routing with local decisions at each sensor node, and (b) determination of the number of paths needed to ensure reliable data transmission. We then extended our work by considering more realistic constraints, notably link reliability and the variation of link capacities (as a function of the remaining energy of intermediate nodes). Simulation results showed savings of about 25% of the total energy
La Tuberculose pulmonaire chronique : à propos de 50 malades, hospitalisés dans un service de pneumo-phtisiologie au Centre hospitalier de Bayonne, 1975-1976 by Jean-François Couchot( Book )

1 edition published in 1978 in French and held by 1 WorldCat member library worldwide

Systèmes Cyber-Physiques autonomes et communicants en milieux inconnus : application à l'exploration par robots mobiles. by Virgile Daugé( )

1 edition published in 2021 in French and held by 1 WorldCat member library worldwide

This document retraces the work and contributions made in the context of autonomous robotic exploration of unknown environments. An analysis of the state of the art led us to move away from the classical paradigms of simultaneous localization and mapping (SLAM) algorithms: these approaches require heavy parameter tuning that only specialists can master, and current systems all suffer from the data-association problem. The proposed positioning system is therefore inspired by ancient surveying techniques, as well as by active cooperative measurement between ships and lighthouses. Using several robots that collaboratively measure their relative positions, thus forming a positioning network, we built a robust positioning system that is independent of the environment. This work presents the evaluation of the system, as well as its application to mapping, leading to a complete SLAM system. Original algorithms for optimizing the spatial placement of the elements forming the positioning network are also presented. The work thus culminated in a functional prototype, tested and evaluated under real conditions
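The relative-positioning idea (a robot measuring distances to neighbours whose positions are known, like a ship sighting lighthouses) can be illustrated with a textbook 2-D trilateration step. The linearisation below is a generic technique shown under assumed coordinates, not the thesis's positioning-network algorithm:

```python
def trilaterate(anchors, dists):
    """Estimate a 2-D position from three anchor positions and measured
    distances: subtracting the first circle equation from the other two
    linearises the system, leaving a 2x2 linear solve."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21          # non-zero iff anchors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# a robot at unknown (1, 2) measures its distance to three fixed neighbours
pos = trilaterate([(0, 0), (4, 0), (0, 4)], (5 ** 0.5, 13 ** 0.5, 5 ** 0.5))
```

With noisy measurements and more than three neighbours, the same linear system is solved in the least-squares sense, which is where the placement of the network elements starts to matter.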
Audience Level
Audience level: 0.94 (from 0.92 for Collaborat ... to 0.98 for La Tubercu ...)

English (19)

French (4)