Publications

Sanchez A, Madeira A, Barbosa LS.  2015.  On the verification of architectural reconfigurations. Computer Languages, Systems & Structures. 44:218-237.
Oliveira N, Silva A, Barbosa LS.  2015.  IMCReo: interactive Markov chains for Stochastic Reo. Journal of Internet Services and Information Security (JISIS). 5(1):3-28.
Oliveira N, Barbosa LS.  2015.  Self-adaptation by coordination-targeted reconfigurations. Journal of Software Engineering Research and Development. 3(6):2-31.

Background: A software system is self-adaptive when it is able to dynamically and autonomously respond to changes detected either in its internal components or in its deployment environment. This response is expected to ensure the continuous availability of the system by maintaining its functional and non-functional requirements.

Methods: Since these systems are usually distributed, coordination middleware (typically a centralised architectural entity) plays a definitive role in establishing the system goals. For these reasons, adaptations may be triggered at the coordination level, issuing reconfigurations to such a coordination entity. However, predicting exactly when reconfigurations are needed, and whether they will lead the system into a non-disruptive configuration, is still an issue at this level. This paper builds on a framework for formal verification of architectural requirements, from either a qualitative or a quantitative (probabilistic) point of view, which leverages analysis and adaptation prediction.

Results: To address these difficulties, the paper discusses both a model that lays down reconfiguration strategies, planned at design time, and a process that actively uses such a model to trigger coordination-targeted reconfigurations at run time. Moreover, a cloud-based architecture for the implementation of this strategy is proposed, as an attempt to deliver adaptation as a service. A case study is presented that assesses the suitability of the approach for real-world software systems.

Conclusions: We highlight the use of formal models to represent the coordination layer and necessary reconfigurations of a software system, and also to predict the need for (and to trigger) adaptations.

Ferreira JF, Mendes A.  2015.  A calculational approach to path-based properties of the Eisenstein–Stern and Stern–Brocot trees via matrix algebra. Journal of Logical and Algebraic Methods in Programming.

This paper proposes a calculational approach to proving properties of two well-known binary trees used to enumerate the rational numbers: the Stern–Brocot tree and the Eisenstein–Stern tree (also known as the Calkin–Wilf tree). The calculational style of reasoning is enabled by a matrix formulation that is well suited to formulating path-based properties, since it provides a natural way to refer to paths in the trees.

Three new properties are presented. First, we show that nodes with palindromic paths contain the same rational in both the Stern–Brocot and Eisenstein–Stern trees. Second, we show how certain numerators and denominators in these trees can be written as the sum of two squares x² and y², with the rational x/y appearing in specific paths. Finally, we show how we can construct Sierpiński's triangle from these trees of rationals.
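
To make the path-based reading concrete, the following minimal Haskell sketch (not taken from the paper, which reasons calculationally over the matrix formulation) computes the rational at a given L/R path in each tree and brute-forces the palindromic-path property over short paths; the function names and the exhaustive check are illustrative assumptions.

    -- Minimal sketch (not from the paper): rationals at L/R paths in the
    -- Stern-Brocot and Eisenstein-Stern (Calkin-Wilf) trees.
    import Data.Ratio (Ratio, (%))

    data Move = L | R deriving (Eq, Show)

    -- Stern-Brocot: keep the two boundary fractions and take their mediant.
    sternBrocot :: [Move] -> Ratio Integer
    sternBrocot = go (0, 1) (1, 0)
      where
        go (ln, ld) (rn, rd) path = case path of
          []       -> (ln + rn) % (ld + rd)
          (L : ms) -> go (ln, ld) (ln + rn, ld + rd) ms
          (R : ms) -> go (ln + rn, ld + rd) (rn, rd) ms

    -- Eisenstein-Stern: root 1/1, left child a/(a+b), right child (a+b)/b.
    eisensteinStern :: [Move] -> Ratio Integer
    eisensteinStern = go (1, 1)
      where
        go (a, b) []       = a % b
        go (a, b) (L : ms) = go (a, a + b) ms
        go (a, b) (R : ms) = go (a + b, b) ms

    -- Palindromic paths carry the same rational in both trees.
    palindromeProperty :: [Move] -> Bool
    palindromeProperty p =
      p /= reverse p || sternBrocot p == eisensteinStern p

    -- Exhaustive check over all paths up to a given length.
    checkUpTo :: Int -> Bool
    checkUpTo n = all palindromeProperty
      [ p | k <- [0 .. n], p <- sequence (replicate k [L, R]) ]

    main :: IO ()
    main = print (checkUpTo 12)   -- prints True

Running main confirms the palindrome property for every path of length at most 12.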

Fischer S, Hu Z, Pacheco H.  2015.  The essence of bidirectional programming. Science China Information Sciences. 58:1–21.
Campos JC, Doherty G, Harrison M.  2014.  Analysing interactive devices based on information resource constraints. International Journal of Human-Computer Studies. 72(3):284-297.

Analysis of the usability of an interactive system requires both an understanding of how the system is to be used and a means of assessing the system against that understanding. Such analytic assessments are particularly important in safety-critical systems as latent vulnerabilities may exist which have negative consequences only in certain circumstances. Many existing approaches to assessment use tasks or scenarios to provide explicit representation of their understanding of use. These normative user behaviours have the advantage that they clarify assumptions about how the system will be used but have the disadvantage that they may exclude many plausible deviations from these norms. Assessments of how a design fails to support these user behaviours can be a matter of judgement based on individual experience rather than evidence. We present a systematic formal method for analysing interactive systems that is based on constraints rather than prescribed behaviour. These constraints capture precise assumptions about what information resources are used to perform action. These resources may either reside in the system itself or be external to the system. The approach is applied to two different medical device designs, comparing two infusion pumps currently in common use in hospitals. Comparison of the two devices is based on these resource assumptions to assess consistency of interaction within the design of each device.

Gomes T, Abade T, Campos JC, Harrison M, Silva JL.  2014.  A Virtual Environment based Serious Game to Support Health Education. EAI Endorsed Transactions on Ambient Systems. 14(3):e5.

APEX was developed as a framework for ubiquitous computing (ubicomp) prototyping through virtual environments. In this paper the framework is used as a platform for developing a serious game designed to instruct and to inform. The paper describes the Asthma game, a game aimed at raising awareness among children of asthma triggers in the home. It is designed to stimulate a healthier life-style for those with asthma and respiratory problems. The game was developed as the gamification of a checklist for the home environment of asthma patients.

Silva JL, Campos JC, Harrison M.  2014.  Prototyping and Analysing Ubiquitous Computing Environments using Multiple Layers. International Journal of Human-Computer Studies. 72(5):488-506.

If ubiquitous computing (ubicomp) is to enhance physical environments, then early and accurate assessment of alternative solutions will be necessary to avoid costly deployment of systems that fail to meet requirements. This paper presents APEX, a prototyping framework that combines a 3D Application Server with a behaviour modelling tool. The contribution of this framework is that it allows exhaustive analysis of the behaviour models that drive the prototype while at the same time enabling immersive exploration of a virtual environment simulating the proposed system. The development of prototypes is supported through three layers: a simulation layer (using OpenSimulator), a modelling layer (using CPN Tools), and a physical layer (using external devices and real users). APEX allows movement between these layers to analyse different features, from user experience to user behaviour. The multi-layer approach makes it possible to express user behaviour in the modelling layer, provides a way to reduce the number of real users needed by adding simulated avatars, and supports user testing of hybrids of virtual and real components as well as exhaustive analysis. This paper demonstrates the approach by means of an example, placing particular emphasis on the simulation of virtual environments, low-cost prototyping and the formal analysis capabilities.

Oliveira JN.  2014.  A relation-algebraic approach to the 'Hoare logic' of functional dependencies. JLAP - The Journal of Logic and Algebraic Programming.

Abstract algebra has the power to unify seemingly disparate theories once they are encoded into the same abstract formalism. This paper shows how a relation-algebraic rendering of both database dependency theory and Hoare programming logic achieves one such unification, in spite of the latter being an algorithmic theory and the former a data theory. The approach equips relational data with functional types and an associated type system, which is useful for database operation type checking and optimization. The prospect of a generic, unified approach to both programming and data theories on top of libraries already available in automated deduction systems is envisaged.
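
As a rough illustration of the relation-algebraic reading of a functional dependency (a toy sketch with illustrative names, not the paper's typed point-free development): a dependency A → B holds in a table precisely when the relation pairing A-projections with B-projections is simple, i.e. composing it with its converse stays inside the identity, which pointwise says that equal A-projections force equal B-projections.

    -- Toy sketch (not from the paper): a functional dependency A -> B holds in
    -- a table t when the relation { (A-projection r, B-projection r) | r <- t }
    -- is simple, i.e. its composition with its converse is below the identity.
    import qualified Data.Map.Strict as Map

    type Row k v = Map.Map k v   -- a row maps attribute names to values

    project :: Ord k => [k] -> Row k v -> [Maybe v]
    project attrs row = map (`Map.lookup` row) attrs

    -- Pointwise reading of simplicity: equal A-projections force equal B-projections.
    holdsFD :: (Ord k, Eq v) => [k] -> [k] -> [Row k v] -> Bool
    holdsFD as bs table =
      and [ project bs r1 == project bs r2
          | r1 <- table, r2 <- table
          , project as r1 == project as r2 ]

    -- Example: Flight, Day -> Gate holds; Gate -> Flight fails.
    example :: [Row String String]
    example =
      [ Map.fromList [("Flight", "83"), ("Day", "Mon"), ("Gate", "2")]
      , Map.fromList [("Flight", "83"), ("Day", "Tue"), ("Gate", "2")]
      , Map.fromList [("Flight", "92"), ("Day", "Mon"), ("Gate", "2")]
      ]

    main :: IO ()
    main = print ( holdsFD ["Flight", "Day"] ["Gate"] example   -- True
                 , holdsFD ["Gate"] ["Flight"] example )        -- False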

Cunha J, Saraiva JA, Visser J.  2014.  Model-Based Programming Environments for Spreadsheets. Science of Computer Programming. 96:254-275.

Spreadsheets can be seen as a flexible programming environment. However, they lack some of the concepts of regular programming languages, such as structured data types. This can lead users to edit the spreadsheet in the wrong way, perhaps causing corrupt or redundant data. We devised a method for extracting a relational model from a spreadsheet and subsequently embedding the model back into it, creating a model-based spreadsheet programming environment. The extraction algorithm is specific to spreadsheets, since it considers particularities such as layout and column arrangement. The extracted model is used to generate formulas and visual elements that are then embedded in the spreadsheet, helping the user to edit data correctly. We present preliminary experimental results from applying our approach to a sample of spreadsheets from the EUSES Spreadsheet Corpus. Finally, we conduct the first systematic empirical study to assess the effectiveness and efficiency of this approach. A set of spreadsheet end users worked with two different model-based spreadsheets, and we present and analyse the results achieved.

Neves R, Madeira A, Martins M, Barbosa LS.  2014.  An institution for Alloy and its translation to second-order logic. Integration of Reusable Systems. 263:45-75.

Lightweight formal methods, of which Alloy is a prime example, offer the rigour of mathematics without compromising simplicity of use or suitable tool support. In some cases, however, the verification of safety- or mission-critical software entails the need for more sophisticated technologies, typically based on theorem provers. This explains a number of attempts, documented in the literature, to connect Alloy to specific theorem provers. This chapter, however, takes a different perspective: instead of focusing on one more combination of Alloy with still another prover, it lays out the foundations to fully integrate this system into the Hets platform, which supports a huge network of logics, logic translators and provers. This makes it possible for Alloy specifications to “borrow” the power of several, non-dedicated proof systems. The chapter extends the authors’ previous work on this subject by developing in full detail the semantical foundations for this integration, including a formalisation of Alloy as an institution, and introducing a new, more general translation of the latter to second-order logic.

Diaconescu R, Madeira A.  2014.  Encoding Hybridised Institutions into First Order Logic. Mathematical Structures in Computer Science. :1-44.

A ‘hybridization’ of a logic, referred to as the base logic, consists of developing the characteristic features of hybrid logic on top of the respective base logic, both at the level of syntax (i.e. modalities, nominals, etc.) and of the semantics (i.e. possible worlds). By ‘hybridized institutions’ we mean the result of this process when logics are treated abstractly as institutions (in the sense of the institution theory of Goguen and Burstall). This work develops encodings of hybridized institutions into (many-sorted) first order logic (abbreviated FOL) as a ‘hybridization’ process of abstract encodings of institutions into FOL, which may be seen as an abstraction of the well-known standard translation of modal logic into first order logic. The concept of encoding employed by our work is that of comorphism from institution theory, which is a rather comprehensive concept of encoding as it features encodings both of the syntax and of the semantics of logics/institutions. Moreover, we consider the so-called theoroidal version of comorphisms that encode signatures to theories, a feature that accommodates a wide range of concrete applications. Our theory is also general enough to accommodate various constraints on the possible worlds semantics as well as a wide variety of quantifications. We also provide pragmatic sufficient conditions for the conservativity of the encodings to be preserved through the hybridization process, which makes it possible to shift a formal verification process from the hybridized institution to FOL.
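
For readers unfamiliar with the standard translation alluded to above, here is a small self-contained Haskell sketch of it for basic modal logic over a single accessibility relation (a rendering of the classical textbook construction, not of the paper's institution-level encoding); the datatypes and names are illustrative.

    -- Sketch of the standard translation ST_x of basic modal logic into
    -- first-order logic over one binary accessibility relation R and one
    -- unary predicate P_p per propositional symbol p.
    data Modal
      = PropM String
      | NotM Modal
      | AndM Modal Modal
      | Box Modal          -- "necessarily"
      | Dia Modal          -- "possibly"
      deriving Show

    data FOL
      = Pred String [String]   -- P_p(x), R(x,y), ...
      | NotF FOL
      | AndF FOL FOL
      | ImpF FOL FOL
      | Forall String FOL
      | Exists String FOL
      deriving Show

    -- st x n phi: translate phi relative to world variable x, using the
    -- counter n to generate fresh world variables.
    st :: String -> Int -> Modal -> FOL
    st x _ (PropM p)  = Pred ("P_" ++ p) [x]
    st x n (NotM f)   = NotF (st x n f)
    st x n (AndM f g) = AndF (st x n f) (st x n g)
    st x n (Box f)    = Forall y (ImpF (Pred "R" [x, y]) (st y (n + 1) f))
      where y = "w" ++ show n
    st x n (Dia f)    = Exists y (AndF (Pred "R" [x, y]) (st y (n + 1) f))
      where y = "w" ++ show n

    standardTranslation :: Modal -> FOL
    standardTranslation = st "w0" 1

    main :: IO ()
    main = print (standardTranslation (Box (PropM "p")))
    -- Forall "w1" (ImpF (Pred "R" ["w0","w1"]) (Pred "P_p" ["w1"]))

For example, the box formula over p is translated to the first-order sentence ∀w1 (R(w0,w1) → P_p(w1)), relative to the free world variable w0.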

Martins M, Madeira A, Barbosa LS.  2014.  The role of logical interpretations on program development. Logical Methods in Computer Science. 10(1):1-39.

Stepwise refinement of algebraic specifications is a well known formal methodology for program development. However, traditional notions of refinement based on signature morphisms are often too rigid to capture a number of relevant transformations in the context of software design, reuse, and adaptation. This paper proposes a new approach to refinement in which signature morphisms are replaced by logical interpretations as a means to witness refinements. The approach is first presented in the context of equational logic, and later generalised to deductive systems of arbitrary dimension. This allows, for example, refining sentential into equational specifications and the latter into modal ones.
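
A textbook instance of the kind of interpretation the abstract mentions, not taken from the paper itself, is the algebraization of classical propositional (sentential) logic into the equational theory of Boolean algebras: each formula is mapped to an equation, and derivability is preserved and reflected.

    \tau(\varphi) \;=\; (\varphi \approx \top),
    \qquad
    \Gamma \vdash_{\mathrm{CPC}} \varphi
    \quad\Longleftrightarrow\quad
    \{\, \gamma \approx \top \mid \gamma \in \Gamma \,\} \vdash_{\mathrm{BA}} \varphi \approx \top

Here CPC is the classical propositional calculus and BA the equational theory of Boolean algebras; it is this style of interpretation, rather than a signature morphism, that witnesses the refinement of a sentential specification into an equational one.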