Tesi di Dottorato

Search Results

Now showing 1 - 10 of 51
  • Item
    Declarative solutions for the Manipulation of Articulated Objects Using Dual-Arm Robots
    (Università della Calabria, 2020-03-17) Bertolucci, Riccardo; Leone, Nicola; Maratea, Marco; Mastrogiovanni, Fulvio
    The manipulation of flexible objects is of primary importance in Industry 4.0 and in home environment scenarios. Traditionally, this problem has been tackled by developing ad-hoc approaches, which lack flexibility and portability. We propose an approach in which a flexible object is modelled as an articulated object, or rather, a set of links connected via joints. In this thesis we present a framework based on Answer Set Programming (ASP) for the automated manipulation of articulated objects in a robot architecture. In particular, ASP is employed for representing the configuration of the articulated object, for checking the consistency of the knowledge base, and for generating the sequence of manipulation actions. The framework is exemplified and validated on the Baxter dual-arm manipulator in a simple reference scenario, in which we carried out different experiments analysing the behaviour of different strategies for the action planning module. Our aim is to understand the performance of these approaches with respect to both planning time and execution time. We then extend this scenario to achieve a higher accuracy of the setup and to introduce a few constraints on robot action execution to improve feasibility. Since the extended scenario entails a high number of possible actions that can be fruitfully combined, we exploit macro actions from automated planning in the module that generates the sequence of actions, in order to deal with this extended scenario more effectively. Moreover, we analyse the possibilities of mixed encodings with both simple actions and macro actions from automated planning in different "concentrations". We finally validate the framework on this extended scenario as well, confirming the applicability of ASP in this complex context and showing the usefulness of macro actions in this robotics application.
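    As a purely illustrative aside (a hypothetical sketch, not the encoding used in the thesis), an ASP planning module of this kind typically pairs a choice rule that guesses one manipulation action per time step with rules describing action effects and a constraint enforcing the goal configuration; the predicates rotate/3, joint/3 and angle/3 below are invented for the example.
      % Guess at most one rotation action per time step (hypothetical predicates).
      { rotate(J,D,T) : joint(J,_,_), delta(D) } 1 :- step(T).
      % Effect: rotating joint J by D degrees updates the angle of the attached link L.
      angle(L,A2,T+1) :- rotate(J,D,T), joint(J,L,_), angle(L,A1,T), A2 = (A1+D) \ 360.
      % Inertia: a link keeps its angle unless one of its joints is moved.
      angle(L,A,T+1) :- angle(L,A,T), step(T), not moved(L,T).
      moved(L,T) :- rotate(J,_,T), joint(J,L,_).
      % The goal configuration must hold at the planning horizon.
      :- goal(L,A), horizon(H), not angle(L,A,H).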
  • Item
    Dyadic TGDs - A new paradigm for ontological query answering
    (Università della Calabria, 2022-03-11) Marte, Cinzia; Greco, Gianluigi; Manna, Marco; Guerriero, Francesca; Leone, Nicola
    Ontology-Based Query Answering (OBQA) consists in querying databases by taking ontological knowledge into account. We focus on a logical framework based on existential rules or tuple-generating dependencies (TGDs), also known as Datalog±, which collects the basic decidable classes of TGDs and generalizes several ontology specification languages. While there exist many different classes in the literature, in most cases each of them requires the development of a specific solver and only rarely does the definition of a new class allow the use of existing systems. This gap between the number of existing paradigms and the number of developed tools prompted us to define a combination of Shy and Ward (two well-known classes that enjoy good computational properties) with the aim of exploiting the tool developed for Shy. Nevertheless, while studying how to merge these two classes, we realized that it would be possible to define, in a more general way, the combination of existing classes, in order to make the most of existing systems. Hence, in this work, starting from the analysis of the two aforementioned existing classes, we define a more general class, named Dyadic TGDs, that allows all the decidable classes to be extended in a uniform and elegant way, while using the existing related systems. At the same time, we also define a combination of Shy and Ward, named Ward+, and we show that it can be seen as a Dyadic set of TGDs. Finally, to support the theoretical part of the thesis, we implement a BCQ evaluation algorithm for the class Ward+ that takes advantage of an existing solver developed for Shy.
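    For orientation, a tuple-generating dependency (existential rule) has the general shape of a conjunctive body implying an existentially quantified head; the concrete relation names below are an invented example rather than one taken from the thesis:
      \forall X\,\forall Y\,\big(\, \mathit{project}(X) \wedge \mathit{leads}(Y,X) \;\rightarrow\; \exists Z\; \mathit{reportsTo}(Y,Z) \,\big)
    Boolean conjunctive query (BCQ) answering then asks whether a query such as \exists V\, \mathit{reportsTo}(\mathit{ann},V) is entailed by the database together with a set of such rules.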
  • Item
    Ontology-driven information extraction
    (2017-07-20) Adrian, Weronika Teresa; Leone, Nicola; Manna, Marco
    Information Extraction consists in obtaining structured information from unstructured and semi-structured sources. Existing solutions use advanced methods from the fields of Natural Language Processing and Artificial Intelligence, but they usually aim at solving sub-problems of IE, such as entity recognition, relation extraction or co-reference resolution. However, in practice, it is often necessary to build on the results of several tasks and arrange them in an intelligent way. Moreover, nowadays, Information Extraction faces new challenges related to large-scale collections of documents in complex formats beyond plain text. An apparent limitation of existing works is the lack of a uniform representation of the document analysis from multiple perspectives, such as semantic annotation of text, structural analysis of the document layout and processing of the integrated knowledge. Recent proposals for ontology-based Information Extraction do not fully exploit the possibilities of ontologies, using them only as a reference model for a single extraction method, such as semantic annotation, or for defining the target schema for the extraction process. In this thesis, we address the problem of Information Extraction from homogeneous collections of documents, i.e., sets of files that share some common properties with respect to content or layout. We observe that interleaving semantic and structural analysis can benefit the results of the IE process, and we propose an ontology-driven approach that integrates and extends existing solutions. The contributions of this thesis are of both theoretical and practical nature. With respect to the first, we propose a model and a process of Semantic Information Extraction that integrates techniques from semantic annotation of text, document layout analysis, object-oriented modeling and rule-based reasoning. We adapt existing solutions to enable their integration under a common ontological view and advance the state of the art in the fields of semantic annotation and document layout analysis. In particular, we propose a novel method for automatic lexicon generation for semantic annotators, and an original approach to layout analysis based on common-label identification and structure recognition. We design and implement a framework named KnowRex that realizes the proposed methodology and integrates the elaborated solutions.
  • Item
    CORE: an intelligent transportation system in Calabria
    (2017-02-22) Santoro, Francesco; Leone, Nicola; Laganà, Demetrio; Musmanno, Roberto
  • Item
    Paracoherent Answer Set Programming
    (2017-02-22) Amendola, Giovanni; Leone, Nicola; Eiter, Thomas
    Answer Set Programming (ASP) is a declarative programming paradigm based on the stable model semantics. The idea underlying ASP is to encode a computational problem into a logic program whose stable models, also called answer sets, correspond to the solutions of the original problem. The answer set semantics may assign no model at all to a logic program, because of logical contradictions or of unstable negation, caused by the cyclic dependence of an atom on its negation. While logical contradictions can be handled with the traditional techniques used in paraconsistent reasoning, the instability of negation requires other methods. Here we resort to a paracoherent semantics based on 3-valued interpretations, in which a third truth value, besides the classical true and false, expresses that an atom is believed to be true. This is the basis of the semi-stable model semantics, which was defined through a transformation of the original program. The starting point of this thesis is a paper presented in 2010 at the twelfth International Conference on Principles of Knowledge Representation and Reasoning [21], which first of all offers a characterization of semi-stable models that makes the semantics more comprehensible. Moreover, motivated by some anomalies of that semantics with respect to fundamental epistemic properties, a correction satisfying those properties is proposed. For this new semantics, both a definition through a transformation of the original program and a model-theoretic characterization are given; the resulting models turn out to be a relaxation of equilibrium logic, a logic characterizing the answer set semantics. The semantics introduced is therefore called the semi-equilibrium model semantics. In the thesis we consider some refinements of this semantics with respect to modularity in the program rules, based on splitting sets, the main tool for modularity used in modelling and evaluating ASP programs. Among these, we select classes of canonical models that allow a bottom-up evaluation of ASP programs, with the option of switching to a paracoherent mode when the lack of an answer set is detected. The analysis of the computational complexity of the main reasoning tasks shows that semi-equilibrium models are computationally harder than answer sets (i.e., equilibrium models), due to a global minimization that is needed to keep the gap between atoms believed true and atoms true as small as possible. We then consider different algorithms for computing semi-stable and semi-equilibrium models, implementing and integrating them within a framework for answer set construction. We report the results of experiments conducted on benchmarks from the ASP competitions, identifying the most efficient algorithm. The results of this thesis contribute to the logical foundations of paracoherent ASP, which is gradually gaining importance in inconsistency management, and at the same time provide a basis for the development of algorithms and for their integration into an ASP solver.
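    A minimal textbook-style illustration (not code from the thesis) of the negation instability discussed above: the program below has no answer set, because p depends cyclically on its own negation, so even the harmless fact q is lost; a paracoherent semantics such as the semi-stable or semi-equilibrium one instead assigns p the third truth value "believed true" and preserves q.
      % p depends cyclically on its own negation: no answer set exists.
      p :- not p.
      % An unrelated fact that standard ASP nevertheless cannot return.
      q.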
  • Item
    The impact of software on Mathematics education in high schools
    (2017-02-22) Frassia, Maria Giovanna; Leone, Nicola; Serpe, Annarosa
    The relationship between technology and the teaching and learning of Mathematics is a complex phenomenon that deserves to be observed from several points of view. One of these is the role that software plays, and could play, in the didactic practice of Mathematics. Over the last thirty years, the availability of software in Italian upper secondary schools has increased considerably, as in many other countries, and this has opened new perspectives in the teaching and learning of Mathematics. In this regard there is a vast and growing research literature on the integration of educational software, precisely because different tools have different characteristics that affect how they are used in the classroom: what matters is not so much the instrumental impact as the extent of the change generated in the teaching-learning process. The aim of this thesis is to give a central place to the study of the impact of software on the teaching of Mathematics in upper secondary school. For this reason, two different types of software were chosen and examined in terms of their didactic intentions and potential: the first is the dynamic geometry software GeoGebra, the second is the programming environment MatCos. The study is aimed at examining the effects produced by the impact of these software tools on the teaching of the following topics: the geometric construction of plane curves and the calculation of probability. These topics were chosen because in classroom practice they are almost always neglected, with a consequent serious loss in the educational path of the students. The research study involved a sample of upper secondary school students divided into two groups: the first used the two software tools during all the experimental activities, while the second did not. The experimental activity made it possible to investigate the impact of the two software tools on the teaching of the chosen topics, as well as the differences, in terms of learning outcomes, between the two groups of students. Ultimately, this impact study examined the framing of the results obtained by the students, observed on the epistemological, cognitive and didactic levels.
  • Item
    Pure strategies in security games
    (2017-07-20) Marek, Adrian; Leone, Nicola; Greco, Gianluigi
    In the past decades, game theory has made it possible to model mathematically conflicts between groups of players, each with their own agenda, and to apply such models in real-life situations. One such application, which has been studied heavily in recent years, is security games. Their primary purpose is to help distribute the scarce resources of an entity whose job is to minimize the chances of a successful attack on a system under its care. The underlying assumption is that the second player, whose task is to launch a successful attack on the system, has sufficient time to observe the way in which the resources are used and to find any patterns or regularities. These games have been successfully applied to several real-life situations, such as the deployment of checkpoints and canine units at LAX airport, the scheduling of air marshals on commercial airplanes, or the planning of patrol routes in the port of Boston. All of these works concentrate on mixed strategies, that is, on using a random element to decide how the defender manages its resources. While this approach is understandable in the case of one player managing all of the resources, it becomes less obvious when there are multiple defenders, each with knowledge of the other players' priorities but with limited means to coordinate. Finding what can be said about pure strategies in such a situation is the goal of this thesis. The thesis provides a brief overview of what is currently known about modeling security games. The contribution is the following: a model for a multiple-defender security game with sequential resource allocation is presented and notions of reasonable behavior are described; a polynomial algorithm for finding a reasonable move and predicting the other players' decisions is given. Moreover, it is proven that even if the actual play of the other players differs from what was predicted, the result still satisfies the assumption of rationality of the players. Finally, to emulate coordination among the players, the model is expanded by adding an additional player, called the Overseer, whose goal is to ensure that a set of targets is protected by deciding the order in which all the other players commit. It is shown that deciding whether such a sequence of players exists for a target set of any size can be reduced, in polynomial time, to the case of a one-element set. A partial result identifying cases in which this problem is in the class P is also given.
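    As background (a standard formulation from the security-games literature, not a definition specific to this thesis), in a Stackelberg security game a mixed strategy is summarised by a coverage probability c_t for each target t, and the defender's expected utility when t is attacked is
      U_d(c,t) \;=\; c_t\, U_d^{\mathrm{cov}}(t) \;+\; (1-c_t)\, U_d^{\mathrm{unc}}(t),
    where U_d^{\mathrm{cov}}(t) and U_d^{\mathrm{unc}}(t) are the defender's payoffs when the attacked target is covered or uncovered, respectively. Pure strategies correspond to the extreme case c_t \in \{0,1\}, which is the setting investigated here for multiple defenders.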
  • Item
    Design and implementation of a modern ASP grounder
    (2018-01-19) Zangari, Jessica; Leone, Nicola; Calimeri, Francesco; Perri, Simona
    Answer Set Programming (ASP) is a declarative programming paradigm proposed in the area of non-monotonic reasoning and logic programming in the late '80s and early '90s. Thanks to its expressivity and its capability of dealing with incomplete knowledge, ASP became widely used in AI and is recognized as a powerful tool for Knowledge Representation and Reasoning (KRR). On the other hand, its high expressivity comes at the price of a high computational cost, thus requiring reliable and high-performance implementations. Throughout the years, significant effort has been spent on defining techniques for an efficient computation of its semantics. In turn, the availability of efficient ASP systems made ASP a powerful tool for developing advanced applications in many research areas as well as in industrial contexts. Furthermore, a significant amount of work has been carried out in order to extend the basic language and ease knowledge representation tasks with ASP, and recently a standard input language, namely ASP-Core-2, has been defined, also with the aim of fostering interoperability among ASP systems. Although different approaches for the evaluation of ASP logic programs have been proposed, the canonical approach, adopted in mainstream ASP systems, mimics the definition of the answer set semantics by relying on a grounding module (grounder), which generates a propositional theory semantically equivalent to the input program, coupled with a subsequent module (solver) that applies propositional techniques for generating its answer sets. The former phase, called grounding or instantiation, plays a key role for successful deployment in real-world contexts, as in general the produced ground program is potentially of exponential size with respect to the input program, and therefore the subsequent solving step, in the worst case, takes exponential time in the size of the input. To mitigate these issues, modern grounders employ smart procedures to obtain ground programs that are, in general, significantly smaller than the theoretical instantiation. This thesis focuses on the ex-novo design and implementation of a new, modern and efficient ASP instantiator. To this end, we study a series of techniques geared towards the optimization of the grounding process, questioning the techniques employed by modern grounders with the aim of improving them and introducing further optimization strategies, which lend themselves to integration into the generic grounder module of a traditional ASP system following a ground & solve approach. In particular, we herein present the novel system I-DLV, which incorporates all these techniques and leverages their synergy to perform an efficient instantiation. The system features full support for the ASP-Core-2 standard language and advanced flexibility and customizability mechanisms, and is endowed with an extensible design that eases the incorporation of language updates and optimization techniques. Moreover, its usage is twofold: besides being a stand-alone grounder, it is also a full-fledged deductive database engine. In addition, along with the solver wasp, it has been integrated into the recently released new version of the widespread ASP system DLV.
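    To make the role of the grounder concrete, consider the toy program below (illustrative only, not from the thesis): a naive instantiation would substitute every constant for every variable in every rule, whereas a smart grounder produces only the ground rules whose bodies can actually be supported by derivable atoms.
      % Toy input program.
      edge(a,b). edge(b,c).
      reach(X,Y) :- edge(X,Y).
      reach(X,Z) :- reach(X,Y), edge(Y,Z).
      % A smart grounder emits, e.g., reach(a,c) :- reach(a,b), edge(b,c).
      % and skips the many instantiations over {a,b,c} whose bodies can never hold.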
  • Item
    Topics in metric fixed point theory and stability of dynamical systems
    (2018-01-19) Zaccone, Roberta; Leone, Nicola; Marino, Giuseppe
    In this thesis we introduce iterative methods approximating fixed points of nonlinear operators defined in infinite-dimensional spaces. The starting points are the Implicit and Explicit Midpoint Rules, which generate polygonal functions approximating a solution of an ordinary differential equation in finite-dimensional spaces. Our study has the purpose of determining suitable conditions on the mapping, the underlying space and the coefficients defining the method, in order to obtain strong convergence of the generated sequence to a common solution of a fixed point problem and a variational inequality. The contributions to this topic appear in the papers: G. Marino, R. Zaccone, On strong convergence of some midpoint type methods for nonexpansive mappings, J. Nonlinear Var. Anal., vol. 1 (2017), n. 2, 159-174; G. Marino, B. Scardamaglia, R. Zaccone, A general viscosity explicit midpoint rule for quasi-nonexpansive mappings, J. Nonlinear and Convex Anal., vol. 18 (2017), n. 1, 137-148; J. Garcia-Falset, G. Marino, R. Zaccone, An explicit midpoint algorithm in Banach spaces, to appear in J. Nonlinear and Convex Anal. (2017). Not infrequently, a fixed point iteration scheme is used to find a stationary state of a dynamical system; however, the fixed points may not be stable. In view of this, we study some conditions under which the asymptotic stability of the critical points of a certain dynamical system is ensured. Our contribution to this topic appears in the paper: R. P. Agarwal, G. Marino, H. K. Xu, R. Zaccone, On the dynamics of love: a model including synergism, J. Linear and Nonlinear Anal., vol. 2, n. 1 (2016), 1-16.
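    For reference (classical formulas recalled only to fix the terminology used above), the implicit and explicit midpoint rules for the ordinary differential equation x'(t) = f(x(t)) with step size h are
      x_{n+1} = x_n + h\, f\!\left(\tfrac{x_n + x_{n+1}}{2}\right) \qquad\text{and}\qquad x_{n+1} = x_n + h\, f\!\left(x_n + \tfrac{h}{2}\, f(x_n)\right),
    and the fixed point iterations studied in the cited papers can be read as analogues of these schemes in which the role of f is taken by a nonexpansive (or quasi-nonexpansive) mapping and a viscosity term is added to drive strong convergence.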
  • Item
    Seamless acceleration of numerical regular grid methods on manycore systems
    (2018-01-19) Spataro, Davide; Leone, Nicola; Spataro, William; D'Ambrosio, Donato
    Over the last two decades, a lot has changed regarding the way modern scientific applications are designed, written and executed, especially in the field of data-analytics, scientific computing and visualization. Dedicated computing machines are nowadays large, powerful agglomerates of hundreds or thousands of multi-core computing nodes interconnected via network each coupled with multiple accelerators. Those kinds of parallel machines are very complex and their efficient programming is hard, bug-prone and time-consuming. In the field of scientific computing, and of modeling and simulation especially, parallel machines are used to obtain approximate numerical solutions to differential equations for which the classical approach often fails to solve them analytically making a numerical computer-based approach absolutely necessary. An approximate numerical solution of a partial differential equation can be obtained by applying a number of methods, as the finite element or finite difference method which yields approximate values of the unknowns at a discrete number of points over the domain. When large domains are considered, big parallel machines are required in order to process the resulting huge amount of mesh nodes. Parallel programming is notoriously complex, often requiring great programming efforts in order to obtain efficient solvers targeting large computing cluster. This is especially true since heterogeneous hardware and GPGPU has become mainstream. The main thrust of this work is the creation of a programming abstraction and a runtime library for seamless implementation of numerical methods on regular grids targeting different computer architecture: from commodity single-core laptops to large clusters of heterogeneous accelerators. A framework, OpenCAL had been developed, which exposes a domain specific language for the definition of a large class of numerical models and their subsequent deployment on the targeted machines. Architecture programming details are abstracted from the programmer that with little or no intervention at all can obtain a serial, multi-core, single-GPU, multi- GPUs and cluster of GPUs OpenCAL application. Results show that the framework is effective in reducing programmer effort in producing efficient parallel numerical solvers.