HASLab - Indexed Articles in Journals
Browsing HASLab - Indexed Articles in Journals by Title
Item: AI for the Win: Improving Spectrum-based Fault Localization (2012)
Authors: Birgit Hofer; Franz Wotawa; Rui Maranhão
A considerable amount of time in software engineering is spent on debugging. In practice, debugging mostly relies on tools that execute a program step by step and set breakpoints. This method is, however, very time consuming and cumbersome. There is a need for tools that narrow down the most likely fault locations, requiring as little user interaction as possible and producing results useful enough that such tools appeal to programmers. Towards such tools, we present three variants of the well-known spectrum-based fault localization technique, enhanced using methods from Artificial Intelligence. Each of the three combined approaches outperforms the underlying basic method in diagnostic accuracy. Hence, the presented approaches support the hypothesis that combining techniques from different areas is beneficial. In addition to the introduction of these techniques…
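To ground the baseline technique this abstract builds on, here is a minimal sketch of spectrum-based fault localization using the Ochiai similarity coefficient; the data layout and the choice of Ochiai are illustrative assumptions, not necessarily the exact variants studied in the paper.

```python
import math

def ochiai_ranking(spectra, results):
    """Rank program components by suspiciousness from coverage spectra.

    spectra[i][j] is truthy if component j was executed in test run i;
    results[i] is True if test run i failed.
    """
    n_components = len(spectra[0])
    total_failed = sum(results)
    scores = []
    for j in range(n_components):
        failed_and_covered = sum(1 for i, run in enumerate(spectra)
                                 if run[j] and results[i])
        covered = sum(1 for run in spectra if run[j])
        denom = math.sqrt(total_failed * covered)
        scores.append(failed_and_covered / denom if denom else 0.0)
    # Most suspicious components first.
    return sorted(range(n_components), key=lambda j: -scores[j]), scores

# Toy example: 3 components, 4 test runs (True = failed run).
spectra = [[1, 1, 0], [1, 0, 1], [0, 1, 1], [1, 1, 1]]
results = [True, False, False, True]
ranking, scores = ochiai_ranking(spectra, results)
print(ranking, [round(s, 2) for s in scores])  # components 0,1 tie ahead of 2
```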

Item: Alloy Meets the Algebra of Programming: A Case Study (2013)
Authors: José Nuno Oliveira; Ferreira, M.A.
Relational algebra offers software engineering the same degree of conciseness and calculational power as linear algebra offers other engineering disciplines. Binary relations play the role of matrices, with similar emphasis on multiplication and transposition. This matches Alloy's motto "everything is a relation" and the relational basis of the Algebra of Programming (AoP). Altogether, it provides a simple and coherent approach to checking and calculating programs from abstract models. In this paper, we put Alloy and the Algebra of Programming together in a case study originating from the Verifiable File System mini-challenge put forward by Joshi and Holzmann: verifying the refinement of an abstract file store model into a journaled (FLASH) data model catering for wear leveling and recovery from power loss. Our approach relies on diagrams to graphically express typed assertions. It interweaves model checking (in Alloy) with calculational proofs in a way which offers the best of both worlds. This provides ample evidence of the positive impact in software verification of Alloy's focus on relations, complemented by induction-free proofs about data structures such as stores and lists.
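To make the "binary relations as matrices" analogy concrete, here is a minimal sketch (mine, not the paper's): relations represented as Boolean matrices, with relational composition as matrix multiplication over the Boolean semiring and converse as transposition.

```python
def compose(R, S):
    """Relational composition R;S as Boolean matrix product:
    (R;S)[i][j] iff there is a k with R[i][k] and S[k][j]."""
    return [[any(R[i][k] and S[k][j] for k in range(len(S)))
             for j in range(len(S[0]))]
            for i in range(len(R))]

def converse(R):
    """Relational converse as matrix transposition."""
    return [list(col) for col in zip(*R)]

# 'parent' relation on {0,1,2}: 0 is a parent of 1, and 1 of 2.
parent = [[0, 1, 0],
          [0, 0, 1],
          [0, 0, 0]]
grandparent = compose(parent, parent)   # parent;parent
child = converse(parent)
print(grandparent[0][2], child[1][0])   # True 1
```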

Item: Analysing interactive devices based on information resource constraints (2014)
Authors: José Creissac Campos; Doherty, G.; Michael Douglas Harrison
Analysis of the usability of an interactive system requires both an understanding of how the system is to be used and a means of assessing the system against that understanding. Such analytic assessments are particularly important in safety-critical systems, as latent vulnerabilities may exist which have negative consequences only in certain circumstances. Many existing approaches to assessment use tasks or scenarios to provide an explicit representation of their understanding of use. These normative user behaviours have the advantage that they clarify assumptions about how the system will be used, but the disadvantage that they may exclude many plausible deviations from these norms. Assessments of how a design fails to support these user behaviours can be a matter of judgement based on individual experience rather than evidence. We present a systematic formal method for analysing interactive systems that is based on constraints rather than prescribed behaviour. These constraints capture precise assumptions about what information resources are used to perform actions. These resources may either reside in the system itself or be external to it. The approach is applied to two different medical device designs, comparing two infusion pumps currently in common use in hospitals. The comparison of the two devices is based on these resource assumptions, assessing the consistency of interaction within the design of each device.

Item: Anomaly Detection and Modeling in 802.11 Wireless Networks (2019)
Authors: Anisa Allahdadidastjerdi; Ricardo Morla

Item: Assertion-based Slicing and Slice Graphs (2012)
Authors: Daniela Cruz; José Bernardo Barros; Jorge Sousa Pinto; Pedro Rangel Henriques
This paper revisits the idea of slicing programs based on their axiomatic semantics, rather than using criteria based on control/data dependencies. We show how the forward propagation of preconditions and the backward propagation of postconditions can be combined in a new slicing algorithm that is more precise than the existing specification-based algorithms. The algorithm is based on (i) a precise test for removable statements, and (ii) the construction of a slice graph, a program control flow graph extended with semantic labels. It improves on previous approaches in two aspects: it does not fail to identify removable commands, and it produces the smallest possible slice that can be obtained (in a sense that will be made precise). The paper also reviews in detail, through examples, the ideas behind the use of preconditions and postconditions for slicing programs.
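The "precise test for removable statements" can be illustrated at toy scale: a statement of a straight-line program is removable when the Hoare triple still holds without it. The sketch below checks this with the z3 SMT solver over assignment-only programs; this simplification is an assumption for illustration, not the paper's algorithm or tooling.

```python
from z3 import Int, Implies, Not, Solver, substitute, unsat

x, y = Int('x'), Int('y')

def wp(stmts, post):
    """Weakest precondition of a sequence of assignments:
    wp(v := e, Q) is Q with e substituted for v."""
    for var, expr in reversed(stmts):
        post = substitute(post, (var, expr))
    return post

def valid(pre, stmts, post):
    """Does {pre} stmts {post} hold? Check pre => wp(stmts, post)."""
    s = Solver()
    s.add(Not(Implies(pre, wp(stmts, post))))
    return s.check() == unsat

prog = [(x, x + 1), (y, y * 2), (x, x - 1)]   # x:=x+1; y:=y*2; x:=x-1
pre, post = x >= 0, x >= 0

# Statement i is removable if the triple still holds without it.
for i, stmt in enumerate(prog):
    sliced = prog[:i] + prog[i + 1:]
    print(i, valid(pre, sliced, post))   # 0 False, 1 True, 2 True
```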

Item: Balancing the Formal and the Informal in User-centred Design (2021)
Authors: Michael Douglas Harrison; José Creissac Campos; Paolo Masci
This paper explores the role of formal methods as part of the user-centred design of interactive systems. An iterative process is described, developing prototypes incrementally: proving user-centred requirements while at the same time evaluating the prototypes, which are executable forms of the developed models, using 'traditional' techniques for user evaluation. A formal analysis thus complements user evaluations. This approach enriches user-centred design, which typically focuses on understanding context and producing sketch designs. These sketches are often non-functional (e.g. paper) prototypes. They provide a means of exploring candidate design possibilities using techniques such as cooperative evaluation. This paper describes a further step in the process using formal analysis techniques. The use of formal methods provides a systematic approach to checking plausibility and consistency during early design stages, while at the same time enabling the generation of executable prototypes. The technique is illustrated through an example based on a pill dispenser.

Item: The benefits of formalising design guidelines: a case study on the predictability of drug infusion pumps (2015)
Authors: Paolo Masci; Rimvydas Ruksenas; Patrick Oladimeji; Abigail Cauchi; Andy Gimblett; Karen Yunqiu Li; Paul Curzon; Harold W. Thimbleby
A demonstration is presented of how automated reasoning tools can be used to check the predictability of a user interface. Predictability concerns the ability of a user to determine the outcomes of their actions reliably. It is especially important in situations such as a hospital ward, where medical devices are assumed to be reliable by their expert users (clinicians), who are frequently interrupted and need to quickly and accurately continue a task. There are several forms of predictability. A definition is considered where information is only inferred from the current perceptible output of the system. In this definition, the user is not required to remember the history of actions that led to the current state. Higher-order logic is used to specify predictability, and the Symbolic Analysis Laboratory is used to automatically verify predictability on the real interactive number entry systems of two commercial drug infusion pumps, devices used in the healthcare domain to deliver fluids (e.g., medications, nutrients) into a patient's body in controlled amounts. Areas of unpredictability are precisely identified by the analysis. Verified solutions that make an unpredictable system predictable are presented, through design modifications and verified user strategies that mitigate the identified issues. © 2013, Springer-Verlag London.
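The notion of predictability used here, where the user may rely only on the current perceptible output, can be illustrated with a brute-force check on a toy finite-state device; the model and naming below are illustrative assumptions, not the paper's higher-order logic specifications or the SAL toolchain.

```python
from itertools import product

def reachable(init, actions, step):
    """All states reachable from init under the given actions."""
    seen, frontier = {init}, [init]
    while frontier:
        s = frontier.pop()
        for a in actions:
            t = step(s, a)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

def predictable(init, actions, step, display):
    """If two reachable states look the same on the display, every action
    must lead to states that again look the same; else report a witness."""
    states = reachable(init, actions, step)
    for s, t in product(states, states):
        if display(s) == display(t):
            for a in actions:
                if display(step(s, a)) != display(step(t, a)):
                    return False, (s, t, a)
    return True, None

# Toy number-entry device: state = (value, tenths_mode); a display showing
# only the value hides the mode -- an unpredictable design.
ACTIONS = ['up', 'mode']
def step(state, a):
    value, tenths = state
    if a == 'mode':
        return (value, not tenths)
    inc = 1 if tenths else 10
    return (min(value + inc, 100), tenths)

print(predictable((0, False), ACTIONS, step, display=lambda s: s[0]))
# Showing the mode as well restores predictability:
print(predictable((0, False), ACTIONS, step, display=lambda s: s))
```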

Item: Berry: A code for the differentiation of Bloch wavefunctions from DFT calculations (2024)
Authors: André Martins Pereira
Density functional calculation of the electronic structure of materials is one of the most used techniques in theoretical solid state physics. These calculations retrieve single-electron wavefunctions and their eigenenergies. The berry suite of programs amplifies the usefulness of DFT by ordering the eigenstates into analytic bands, allowing the differentiation of the wavefunctions in reciprocal space. It can then calculate Berry connections and curvatures and the second harmonic generation conductivity. The berry software is implemented for two-dimensional materials and was tested on hBN and InSe. In the near future, more properties and functionalities are expected to be added.
Program summary
Program Title: berry
CPC Library link to program files: https://doi.org/10.17632/mpbbksz2t7.1
Developer's repository link: https://github.com/ricardoribeiro-2020/berry
Licensing provisions: MIT
Programming language: Python3
Nature of problem: Differentiation of Bloch wavefunctions in reciprocal space, numerically obtained from DFT software, applied to two-dimensional materials. This enables the numeric calculation of material properties such as Berry geometries and second harmonic conductivity.
Solution method: Extracts Kohn-Sham functions from a DFT calculation, orders them by analytic bands using graph and AI methods, and calculates the gradient of the wavefunctions along an electronic band.
Additional comments including restrictions and unusual features: Applies only to two-dimensional materials, and only imports Kohn-Sham functions from the Quantum Espresso package.
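As a rough illustration of the core numerical task (an assumption for illustration, not berry's actual code), the sketch below differentiates band coefficients sampled on a uniform 2D k-grid with central finite differences and forms the corresponding Berry connection component A = i⟨u|∂u⟩.

```python
import numpy as np

def k_gradient(u, dkx, dky):
    """Central-difference gradient of band coefficients along kx and ky.
    u has shape (nkx, nky, ncoef): coefficients of one analytic band on a
    uniform k-grid with spacings dkx, dky."""
    du_dkx = np.gradient(u, dkx, axis=0)
    du_dky = np.gradient(u, dky, axis=1)
    return du_dkx, du_dky

rng = np.random.default_rng(0)
u = rng.standard_normal((8, 8, 4)) + 1j * rng.standard_normal((8, 8, 4))
du_dkx, du_dky = k_gradient(u, dkx=0.1, dky=0.1)

# Berry connection component A_x(k) = i <u_k | d u_k / d kx>.
A_x = 1j * np.einsum('xyc,xyc->xy', u.conj(), du_dkx)
print(du_dkx.shape, A_x.shape)   # (8, 8, 4) (8, 8)
```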

Item: bGSL: An imperative language for specification and refinement of backtracking programs (2023)
Authors: Alexandra Sofia Mendes
We present an imperative refinement language for the development of backtracking programs and discuss its semantic foundations. For expressivity, our language includes prospective values and preference, the latter being a variant of Nelson's biased choice that backtracks from infeasibility of a continuation. Our key contribution is to examine feasibility-preserving refinement as a basis for developing backtracking programs, together with several key refinement laws that enable compositional refinement in the presence of non-monotonic program combinators.
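As an intuition for the preference operator, here is a loose sketch of biased choice in general, under the assumption that outcomes are modelled as lists with the empty list marking infeasibility; bGSL's actual semantics (prospective values) is richer than this.

```python
def biased_choice(preferred, alternative, continuation):
    """Nelson-style biased choice: commit to the preferred branch unless the
    continuation is infeasible for every outcome it offers, in which case
    backtrack into the alternative branch."""
    outcomes = [r for v in preferred() for r in continuation(v)]
    if outcomes:
        return outcomes
    return [r for v in alternative() for r in continuation(v)]

# Example: prefer even numbers, but the continuation needs a multiple of 3,
# so the preferred branch is infeasible and we backtrack.
prefer = lambda: [2, 4, 8]
fallback = lambda: [3, 5, 9]
needs_mult3 = lambda v: [v] if v % 3 == 0 else []
print(biased_choice(prefer, fallback, needs_mult3))   # [3, 9]
```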

Item: Boolean Searchable Symmetric Encryption with Filters on Trusted Hardware (2020)
Authors: Portela, B.; Leitao, J.; Domingos, H.J.; Borges, G.; Tiago Filipe Oliveira; Ferreira, B.

Item: A Calculus for Generic, QoS-Aware Component Composition (2012)
Authors: Luís Soares Barbosa; Sun Meng
Software QoS properties, such as response time, availability, bandwidth requirement, and memory usage, among many others, play a major role in the processes of selecting and composing software components. This paper extends a component calculus to deal with them in an effective way. The calculus models components as generalised Mealy machines, i.e., state-based entities interacting throughout their lifetime through well-defined interfaces of observers and actions. QoS is introduced through an algebraic structure specifying the relevant QoS domain and how its values are composed under different disciplines. A major effect of introducing QoS-awareness is that a number of equivalences holding in the plain calculus become refinement laws. The paper also introduces a prototyper for the calculus, developed as a 'proof-of-concept' implementation.
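The following sketch conveys the flavour of QoS-aware composition described above, with illustrative assumptions of my own (response time as the QoS domain, addition as its composition operator); it is not the paper's calculus. Components step like Mealy machines and sequential composition merges the QoS annotations.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class Component:
    """A Mealy-style component: step(state, input) -> (state, output, qos)."""
    state: Any
    step: Callable

def pipe(c1, c2, combine):
    """Sequential composition: c1's output feeds c2; the QoS values of the
    two steps are merged by `combine`, the QoS algebra's operator."""
    def step(state, inp):
        s1, s2 = state
        s1, mid, q1 = c1.step(s1, inp)
        s2, out, q2 = c2.step(s2, mid)
        return (s1, s2), out, combine(q1, q2)
    return Component((c1.state, c2.state), step)

# Two toy components with response-time QoS (ms), composed by addition.
double = Component(None, lambda s, x: (s, x * 2, 5.0))
shift  = Component(0,    lambda s, x: (x, s + x, 3.0))
both = pipe(double, shift, combine=lambda a, b: a + b)
state, out, qos = both.step(both.state, 7)
print(out, qos)   # 14 8.0
```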

Item: CAOVerif: An open-source deductive verification platform for cryptographic software implementations (2014)
Authors: José Bacelar Almeida; Manuel Barbosa; Filliâtre, J.C.; Jorge Sousa Pinto; Vieira, B.
CAO is a domain-specific imperative language for cryptography, offering a rich mathematical type system and crypto-oriented language constructions. We describe the design and implementation of a deductive verification platform for CAO and demonstrate that the development time of such a complex verification tool could be greatly reduced by building on the Jessie plug-in included in the Frama-C framework. We discuss the interesting challenges raised by the domain-specific characteristics of CAO, and describe how we tackle these problems in our design. We base our presentation on real-world examples of CAO code, extracted from the open-source code of the NaCl cryptographic library, and illustrate how various cryptography-relevant security properties can be verified.

Item: A Case for Partitioned Bloom Filters (2023)
Authors: Paulo Sérgio Almeida
In a partitioned Bloom filter (PBF) the bit vector is split into disjoint parts, one per hash function. Contrary to hardware designs, where they prevail, software implementations mostly ignore PBFs, considering them worse than standard Bloom filters (SBF) due to the slightly larger false positive rate (FPR). In this paper, by performing an in-depth analysis, first we show that the FPR advantage of SBFs is smaller than thought; more importantly, by deriving the per-element FPR, we show that SBFs have weak spots in the domain: elements that test as false positives much more frequently than expected. This is relevant in scenarios where an element is tested against many filters. Moreover, SBFs are prone to exhibit extremely weak spots if naive double hashing is used, something occurring in mainstream libraries. PBFs exhibit a uniform distribution of the FPR over the domain, with no weak spots, even using naive double hashing. Finally, we survey scenarios beyond set membership testing, identifying many advantages of having disjoint parts, in designs using SIMD techniques, for filter size reduction, test of set disjointness, and duplicate detection in streams. PBFs are better, and should replace SBFs, in general-purpose libraries and as the base for novel designs.
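For concreteness, a minimal sketch of the two layouts under comparison (illustrative, not the paper's implementation or its double-hashing analysis): a standard Bloom filter lets all k hash functions share one bit vector, while the partitioned variant gives each hash function its own disjoint slice of the same total size.

```python
import hashlib

def _hashes(item, k, m):
    """k indexes in [0, m) derived from independent-ish hashes of item."""
    for i in range(k):
        h = hashlib.sha256(f'{i}:{item}'.encode()).digest()
        yield int.from_bytes(h[:8], 'big') % m

class StandardBloom:
    def __init__(self, m, k):
        self.m, self.k, self.bits = m, k, bytearray(m)
    def add(self, item):
        for idx in _hashes(item, self.k, self.m):
            self.bits[idx] = 1
    def __contains__(self, item):
        return all(self.bits[idx] for idx in _hashes(item, self.k, self.m))

class PartitionedBloom:
    """Same total size m, but hash function i only touches slice i."""
    def __init__(self, m, k):
        self.part, self.k = m // k, k
        self.bits = bytearray(self.part * k)
    def _indexes(self, item):
        for i, idx in enumerate(_hashes(item, self.k, self.part)):
            yield i * self.part + idx
    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = 1
    def __contains__(self, item):
        return all(self.bits[idx] for idx in self._indexes(item))

for f in (StandardBloom(1024, 4), PartitionedBloom(1024, 4)):
    f.add('haslab')
    print('haslab' in f, 'other' in f)   # True, (almost surely) False
```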

Item: Certified computer-aided cryptography: efficient provably secure machine code from high-level implementations (2013)
Authors: Manuel Barbosa; José Bacelar Almeida; Barthe, G.; Dupressoir, F.

Item: CloudMdsQL: querying heterogeneous cloud data stores with a common language (2016)
Authors: Kolev, B.; Valduriez, P.; Bondiombouy, C.; Jimenez Peris, R.; Pau, R.; José Orlando Pereira
The blooming of different cloud data management infrastructures, specialized for different kinds of data and tasks, has led to a wide diversification of DBMS interfaces and the loss of a common programming paradigm. In this paper, we present the design of a cloud multidatastore query language (CloudMdsQL) and its query engine. CloudMdsQL is a functional SQL-like language, capable of querying multiple heterogeneous data stores (relational and NoSQL) within a single query that may contain embedded invocations to each data store's native query interface. The query engine has a fully distributed architecture, which provides important opportunities for optimization. The major innovation is that a CloudMdsQL query can exploit the full power of local data stores by simply allowing some local data store native queries (e.g. a breadth-first search query against a graph database) to be called as functions, and at the same time be optimized, e.g. by pushing down select predicates, using bind join, performing join ordering, or planning intermediate data shipping. Our experimental validation, with three data stores (graph, document and relational) and representative queries, shows that CloudMdsQL satisfies the five important requirements for a cloud multidatastore query language.
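Among the listed optimizations, bind join ships the distinct join-key values seen in one store's intermediate result into the other store's native query as a filter. The sketch below shows the idea with hypothetical store adapters (query_left, query_right_filtered); it is not CloudMdsQL syntax or the engine's code.

```python
def query_left():
    # e.g. rows pulled from a document store
    return [{'user': 'ana', 'doc': 1}, {'user': 'rui', 'doc': 2}]

def query_right_filtered(keys):
    # e.g. a pushed-down native query: SELECT ... WHERE user IN (keys),
    # instead of shipping the whole table to the query engine.
    table = {'ana': 'Porto', 'rui': 'Braga', 'eva': 'Lisboa'}
    return {k: table[k] for k in keys if k in table}

def bind_join(left_rows, key):
    keys = {row[key] for row in left_rows}   # distinct bind values
    right = query_right_filtered(keys)       # one filtered round trip
    return [dict(row, city=right[row[key]])
            for row in left_rows if row[key] in right]

print(bind_join(query_left(), key='user'))
```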

Item: Coalgebra for the working software engineer (2022)
Authors: Luís Soares Barbosa
Often referred to as 'the mathematics of dynamical, state-based systems', Coalgebra claims to provide a compositional and uniform framework to specify, analyse and reason about state and behaviour in computing. This paper addresses this claim by discussing why Coalgebra matters for the design of models and logics for computational phenomena. To a great extent, in this domain one is interested in properties that are preserved along the system's evolution, the so-called 'business rules' or system's invariants, as well as in liveness requirements, stating that e.g. some desirable outcome will eventually be produced. Both classes are examples of modal assertions, i.e. properties that are to be interpreted across a transition system capturing the system's dynamics. The relevance of modal reasoning in computing is witnessed by the fact that most university syllabi in the area include some incursion into modal logic, in particular in its temporal variants. The novelty is that, as happens with the notions of transition, behaviour, or observational equivalence, modalities in Coalgebra acquire a shape. That is, they become parametric on whatever type of behaviour, and corresponding coinduction scheme, seems appropriate for addressing the problem at hand. In this context, the paper revisits Coalgebra from a computational perspective, focussing on three topics central to software design: how systems are modelled, how models are composed, and finally, how properties of their behaviours can be expressed and verified. © 2022, College Publications. All rights reserved.

Item: A coalgebraic perspective on linear weighted automata (2012)
Authors: Filippo Bonchi; Alexandra Silva; Jan Rutten; Michele Boreale; Marcello Bonsangue
Weighted automata are a generalisation of non-deterministic automata where each transition, in addition to an input letter, also carries a quantity expressing the weight (e.g. cost or probability) of its execution. As for non-deterministic automata, their behaviours can be expressed in terms of either (weighted) bisimilarity or (weighted) language equivalence. Coalgebras provide a categorical framework for the uniform study of state-based systems and their behaviours. In this work, we show that coalgebras can suitably model weighted automata in two different ways: coalgebras on Set (the category of sets and functions) characterise weighted bisimilarity, while coalgebras on Vect (the category of vector spaces and linear maps) characterise weighted language equivalence. Relying on the second characterisation, we show three different procedures for computing weighted language equivalence. The first one consists in a generalisation of the usual partition refinement algorithm for ordinary automata…
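The linear-algebra view mentioned here can be sketched concretely: over Vect, two states are weighted-language equivalent exactly when their coordinates agree on the smallest subspace that contains the final-weight vector and is closed under the transition matrices. The rendering below implements that iteration with numpy; the naming and normalisation choices are mine, not the paper's.

```python
import numpy as np

def equivalence_basis(Ms, f, tol=1e-9):
    """Orthonormal basis of the smallest subspace containing the
    final-weight vector f and closed under every transition matrix M_a."""
    basis = []
    def add(v):
        # Gram-Schmidt against the current basis; keep v if independent.
        for b in basis:
            v = v - (b @ v) * b
        n = np.linalg.norm(v)
        if n > tol:
            basis.append(v / n)
            return True
        return False
    frontier = [f.astype(float)]
    while frontier:
        v = frontier.pop()
        if add(v):
            frontier.extend(M @ v for M in Ms)
    return np.array(basis).T   # columns span the subspace

def weighted_equivalent(s, t, Ms, f):
    """States s and t give every word the same weight iff rows s and t of
    the basis matrix coincide."""
    B = equivalence_basis(Ms, f)
    return np.allclose(B[s], B[t])

# 3-state automaton over one letter; states 0 and 1 assign the same weight
# to every word, state 2 does not.
M = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.5]])
f = np.array([1.0, 1.0, 2.0])
print(weighted_equivalent(0, 1, [M], f), weighted_equivalent(0, 2, [M], f))
```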