Recent Publications

Kosar T, Oliveira N, Mernik M, Pereira MJV, Črepinšek M, da Cruz D, Henriques PR.  2010.  Comparing General-Purpose and Domain-Specific Languages: An Empirical Study. Computer Science and Information Systems. 7(2):247–264.
Garis A.  2010.  Lógica Temporal en Verificación de Modelos de Software. Origen y Evolución hasta tiempos actuales. Redalyc. XI:151–161.

In critical software systems, such as aircraft flight-control systems or lift-control systems, software errors must be avoided. Techniques and tools exist today to ensure the correctness of critical systems, “model checking” being one of the most popular. In model checking, both the software system and its desired properties (safety, reachability, etc.) are modelled in formal specification languages, that is, languages based on logical and mathematical concepts. The properties are normally specified in a kind of logic called “temporal logic”. Focusing on that logic, the present work surveys the related epistemological aspects, detailing its origin and evolution from the notions of logic introduced by Aristotle up to its conception as a key tool for the verification of critical software systems.
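
For illustration (the formulas are ours, not taken from the paper), typical lift-controller requirements of the kinds mentioned above read as follows in linear temporal logic, where $\square$ means “always” and $\Diamond$ means “eventually”:

```latex
% safety: the lift never moves while the door is open
\square\,\neg(\mathit{moving} \land \mathit{doorOpen})

% reachability/liveness: every request is eventually served
\square\,(\mathit{request} \rightarrow \Diamond\,\mathit{served})
```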

Rodrigues N.  2010.  Discovering coordination patterns. Electronic Notes in Theoretical Computer Science. 260:189–207.
Barbosa MB, Moss A, Page D.  2009.  Constructive and Destructive Use of Compilers in Elliptic Curve Cryptography. Journal of Cryptology. 22:259–281.

Although cryptographic software implementation is often performed by expert programmers, the range of performance- and security-driven options, as well as more mundane software engineering issues, still makes it a challenge. The use of domain-specific language and compiler techniques to assist in the description and optimisation of cryptographic software is an interesting research direction. In this paper we investigate two aspects of such techniques, focusing on Elliptic Curve Cryptography (ECC) in particular. Our constructive results show that a suitable language allows ECC-based software to be described in a manner close to the original mathematics; the corresponding compiler allows automatic production of an executable whose performance is competitive with that of a hand-optimised implementation. In contrast, we study the worrying potential for naïve compiler-driven optimisation to render cryptographic software insecure. Both aspects of our work are set within the context of CACE, an ongoing EU-funded project on this general topic.
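
To make the “close to the original mathematics” point concrete, here is a minimal Haskell sketch of affine point addition on a short-Weierstrass curve over a prime field. It is our illustration, not the CACE language or compiler, and it deliberately ignores the side-channel and performance concerns the paper discusses; the names (`Pt`, `powMod`, `inv`, `add`) are ours.

```haskell
-- Illustrative only: textbook curve arithmetic, not the CACE DSL.
data Point = O | Pt Integer Integer deriving (Eq, Show)

-- square-and-multiply modular exponentiation
powMod :: Integer -> Integer -> Integer -> Integer
powMod _ 0 m = 1 `mod` m
powMod b e m
  | even e    = sq
  | otherwise = (b `mod` m) * sq `mod` m
  where h  = powMod b (e `div` 2) m
        sq = h * h `mod` m

-- modular inverse via Fermat's little theorem (p prime, x /= 0 mod p)
inv :: Integer -> Integer -> Integer
inv x p = powMod x (p - 2) p

-- point addition on y^2 = x^3 + a*x + b over F_p
add :: Integer -> Integer -> Point -> Point -> Point
add _ _ O q = q
add _ _ q O = q
add a p (Pt x1 y1) (Pt x2 y2)
  | x1 == x2 && (y1 + y2) `mod` p == 0 = O   -- P + (-P) = O
  | otherwise = Pt x3 ((s * (x1 - x3) - y1) `mod` p)
  where
    s | x1 == x2  = (3 * x1 * x1 + a) * inv (2 * y1) p `mod` p      -- tangent (doubling)
      | otherwise = (y2 - y1) * inv ((x2 - x1) `mod` p) p `mod` p   -- chord
    x3 = (s * s - x1 - x2) `mod` p
```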

Martins M, Madeira A, Barbosa LS.  2009.  Refinement by Interpretation in a General Setting. Electronic Notes in Theoretical Computer Science. 259:105–121.

Refinement by interpretation replaces signature morphisms by logic interpretations as a means to translate specifications and witness refinements. The approach was recently introduced by the authors [13] in the context of equational specifications, in order to capture a number of relevant transformations in software design, reuse and adaptation. This paper goes a step further and discusses the generalization of this idea to deductive systems of arbitrary dimension. This makes it possible, for example, to refine sentential specifications into equational ones, and the latter into modal ones. Moreover, the restriction to logics with finitary consequence relations is dropped, which results in increased flexibility along the software development process.

Rodrigues C, Oliveira JN, Barbosa LS.  2009.  A Single Complete Relational Rule for Coalgebraic Refinement. Electronic Notes in Theoretical Computer Science. 259:3–19.

A transition system can be presented either as a binary relation or as a coalgebra for the powerset functor, each representation being obtained from the other by transposition. More generally, a coalgebra for a functor F generalises transition systems in the sense that a shape for transitions is determined by F, typically encoding a signature of methods and observers. This paper explores such a duality to frame coalgebraic refinement in purely relational terms, showing that relational (data) refinement of transition relations, in its two variants, downward and upward (functional) simulations, is equivalent to coalgebraic refinement based on backward and forward morphisms, respectively. Going deeper, it is also shown that downward simulation provides a complete relational rule with which to prove coalgebraic refinement. With this single rule the paper defines a pre-ordered calculus for refinement of coalgebras, with bisimilarity as the induced equivalence. The calculus is monotonic with respect to the main relational operators and an arbitrary relator F, therefore providing a framework for structural reasoning about refinement.
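
As a toy illustration of the relational reading (our sketch, using one common orientation convention, not the paper's full calculus), finite transition relations and a retrieve relation can be checked for downward simulation by relational composition and inclusion:

```haskell
import Data.List (nub)

type Rel a b = [(a, b)]

-- left-to-right relational composition
comp :: (Eq a, Eq b, Eq c) => Rel a b -> Rel b c -> Rel a c
comp r s = nub [ (x, z) | (x, y) <- r, (y', z) <- s, y == y' ]

-- relational inclusion
subRel :: (Eq a, Eq b) => Rel a b -> Rel a b -> Bool
subRel r s = all (`elem` s) r

-- downward simulation of abstract transitions absT by concrete conT
-- under retrieve relation r (abstract-to-concrete): r ; conT <= absT ; r
downwardSim :: (Eq a, Eq c) => Rel a c -> Rel a a -> Rel c c -> Bool
downwardSim r absT conT = (r `comp` conT) `subRel` (absT `comp` r)
```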

Campos M, Barbosa LS.  2009.  Implementation of an Orchestration Language as a Haskell Domain Specific Language. Electronic Notes in Theoretical Computer Science. 255:45–64.

Even though concurrent programming has been a hot topic of discussion in Computer Science for the past 30 years, the community has yet to settle on one, or even a few, standard approaches to implementing concurrent programs. But as more and more cores inhabit our CPUs and more and more services are made available on the web, the problem of coordinating different tasks becomes increasingly relevant.
The present paper addresses this problem with an implementation of the orchestration language Orc as a domain specific language in Haskell. Orc was, therefore, realized as a combinator library using the lightweight threads and the communication and synchronization primitives of the Concurrent Haskell library. With this implementation it becomes possible to create orchestrations that re-use existing Haskell code and, conversely, re-use orchestrations inside other Haskell programs.
The complexity inherent in distributed computation entails the need for the classification of efficient, re-usable concurrent programming patterns. The paper discusses how the calculus of recursive schemes used in the derivation of functional programs scales up to a distributed setting. It is shown, in particular, how to parallelize the entire class of binary tree hylomorphisms.
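
As a flavour of the combinator style (a minimal sketch of ours, not the paper's library), Orc's parallel combinator can be rendered with the lightweight threads and channels of Concurrent Haskell, where each source publishes one value and the results are interleaved in completion order:

```haskell
import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.Chan (newChan, readChan, writeChan)
import Control.Monad (forM_, replicateM)

-- essence of Orc's '|': run sources concurrently, collect publications
-- on a shared channel as they arrive
merge :: [IO a] -> IO [a]
merge sources = do
  out <- newChan
  forM_ sources $ \src -> forkIO (src >>= writeChan out)
  replicateM (length sources) (readChan out)

main :: IO ()
main = do
  rs <- merge [ threadDelay 200000 >> return "slow"
              , return "fast" ]
  print rs   -- most likely ["fast","slow"]
```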

Barbosa LS.  2009.  A perspective on service orchestration. Science of Computer Programming. 74(9):671–687.

Service-oriented computing is an emerging paradigm with increasing impact on the way modern software systems are designed and developed. Services are autonomous, loosely coupled and heterogeneous computational entities able to cooperate to achieve common goals. This paper introduces a model for service orchestration which combines an exogenous coordination model with services’ interfaces annotated with behavioural patterns, specified in a process algebra parametric on the interaction discipline. The coordination model is a variant of Reo, for which a new semantic model is proposed.

Abreu R, Zoeteweij P, Golsteijn R, Van Gemund AJC.  2009.  A practical evaluation of spectrum-based fault localization. Journal of Systems and Software. 82(11):1780–1792.

Spectrum-based fault localization (SFL) shortens the test-diagnose-repair cycle by reducing the debugging effort. As a light-weight automated diagnosis technique it can easily be integrated with existing testing schemes. Since SFL is based on discovering statistical coincidences between system failures and the activity of the different parts of a system, its diagnostic accuracy is inherently limited. Using a common benchmark consisting of the Siemens set and the space program, we investigate this diagnostic accuracy as a function of several parameters (such as quality and quantity of the program spectra collected during the execution of the system), some of which directly relate to test design. Our results indicate that the superior performance of a particular similarity coefficient, used to analyze the program spectra, is largely independent of test design. Furthermore, near-optimal diagnostic accuracy (exonerating over 80% of the blocks of code on average) is already obtained for low-quality error observations and limited numbers of test cases. In addition to establishing these results in the controlled environment of our benchmark set, we show that SFL can effectively be applied in the context of embedded software development in an industrial environment.
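
For concreteness (a sketch of ours; the coefficient the authors' related work identifies as the superior performer is the Ochiai coefficient), a similarity coefficient scores each block of code from its spectrum counts, and blocks are ranked by decreasing score:

```haskell
-- Ochiai similarity for one block: aEf/aEp count failing/passing runs
-- that executed the block, aNf counts failing runs that did not.
ochiai :: Double -> Double -> Double -> Double
ochiai aEf aEp aNf
  | d == 0    = 0
  | otherwise = aEf / d
  where d = sqrt ((aEf + aNf) * (aEf + aEp))
```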

Abreu R, Van Gemund AJC.  2009.  A low-cost approximate minimal hitting set algorithm and its application to model-based diagnosis. Abstraction, Reformulation, and Approximation (SARA).

Generating the minimal hitting sets of a collection of sets is known to be NP-hard, necessitating heuristic approaches to handle large problems. In this paper a low-cost, approximate minimal hitting set (MHS) algorithm, coined Staccato, is presented. Staccato uses a heuristic function, borrowed from a lightweight, statistics-based software fault localization approach, to guide the MHS search. Given the nature of the heuristic function, Staccato is specially tailored to model-based diagnosis problems (where each MHS solution is a diagnosis of the problem), although it is well-suited to other application domains as well. We apply Staccato in the context of model-based diagnosis and show that even for small problems our approach is orders of magnitude faster than the brute-force approach, while still capturing all important solutions. Furthermore, due to its low cost, we also show that Staccato is amenable to large problems involving millions of variables.
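
To see what Staccato is up against, here is the brute-force baseline (our sketch, not the paper's code): enumerate candidate sets by increasing size, keeping those that hit every set and contain no previously found solution. Its cost is exponential in the number of variables, which is exactly what the heuristic search avoids.

```haskell
import Data.List (nub, sortOn, subsequences)

-- all minimal hitting sets of a collection of sets, by exhaustive search
minimalHittingSets :: Eq a => [[a]] -> [[a]]
minimalHittingSets sets = go [] (sortOn length (subsequences universe))
  where
    universe = nub (concat sets)
    hits h   = all (any (`elem` h)) sets
    go found []       = found
    go found (c : cs)
      | any (\f -> all (`elem` c) f) found = go found cs  -- superset of a solution
      | hits c                             = go (found ++ [c]) cs
      | otherwise                          = go found cs
```

For example, `minimalHittingSets [[1,2],[2,3]]` yields `[[2],[1,3]]`.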

Clarke D, Proença J, Lazovik A, Arbab F.  2009.  Deconstructing Reo. Electronic Notes in Theoretical Computer Science. 229(2):43–58.

Coordination in Reo emerges from the composition of the behavioural constraints of the primitives, such as channels, in a component connector. Understanding and implementing Reo, however, has been challenging due to the interaction between the channel metaphor, which is an inherently local notion, and the non-local nature of the constraint propagation imposed by composition. In this paper, the channel metaphor takes a back seat: we focus on the behavioural constraints imposed by the composition of primitives, and phrase the semantics of Reo as a constraint satisfaction problem. Not only does this provide a clear intensional description of the behaviour of Reo connectors in terms of synchronisation and data-flow constraints, it also paves the way for new implementation techniques based on constraint propagation and satisfaction. In fact, decomposing Reo into constraints provides a new computational model for connectors, which we extend to model interaction with an unknown external world, beyond what is currently possible in Reo.
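
The flavour of the encoding can be suggested as follows (our names and a toy data domain, not the paper's formalism): each primitive is a constraint over which channel ends fire and what data flows, composition is conjunction, and a connector's behaviour is the solution set.

```haskell
import Data.Maybe (fromMaybe)

type End        = String
type Assignment = End -> Maybe Int   -- Nothing: the end does not fire
type Constraint = Assignment -> Bool

-- Sync channel: both ends fire with the same datum, or neither fires.
sync :: End -> End -> Constraint
sync a b g = g a == g b

-- LossySync: the source may fire alone; if the sink fires, data matches.
lossySync :: End -> End -> Constraint
lossySync a b g = g b == Nothing || g a == g b

-- composing primitives conjoins their constraints
compose :: [Constraint] -> Constraint
compose cs g = all ($ g) cs

-- brute-force enumeration of admissible firings over a small data domain
solutions :: [End] -> [Int] -> Constraint -> [[(End, Maybe Int)]]
solutions ends dom c = [ asg | asg <- assigns ends
                             , c (\e -> fromMaybe Nothing (lookup e asg)) ]
  where
    assigns []       = [[]]
    assigns (e : es) = [ (e, v) : rest | v <- Nothing : map Just dom
                                       , rest <- assigns es ]
```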

Harrison M, Campos JC.  2008.  Analysing Human Aspects of Safety-Critical Software. ERCIM News. 75:18.

In focusing on human system interactions, the challenge for software engineers is to build systems that allow users to carry out activities and achieve objectives effectively and safely. A well-designed system should also provide a better experience of use, reducing stress and frustration. Many methods aim to help designers to produce systems that have these characteristics. Our research is concerned with the use of formal techniques to help construct such interactive systems.

Silva JL, Campos JC, Paiva A.  2008.  Model-based user interface testing with Spec Explorer and ConcurTaskTrees. Electronic Notes in Theoretical Computer Science. 208:77–93.

Analytic usability analysis methods have been proposed as an alternative to user testing in the early phases of development, due to the cost of the latter approach. Because they work with models of the system, however, analytic methods are not capable of identifying implementation-related problems that might have an impact on usability. Model-based testing enables the testing of an implemented software artefact against a model of what it should be (the oracle). In the case of model-based user interface testing, the models should be expressed at an adequate level of abstraction, adequately modelling the interaction process. This paper describes an effort to develop tool support enabling the use of task models as oracles for model-based testing of user interfaces.

Harrison M, Kray C, Campos JC.  2008.  Exploring an option space to engineer a ubiquitous computing system. Electronic Notes in Theoretical Computer Science. 208:41–55.

Engineering natural and appropriate interactive behaviour in ubiquitous computing systems presents new challenges to their developers. This paper explores formal models of interactive behaviour in ubiquitous systems. Of particular interest is the way that these models may help engineers to visualise the consequences of different designs. Design options based on a dynamic signage system (GAUDI) are explored using different instances of a generic model of the system.

Almeida JB, Pinto JS, Vilaça M.  2008.  Token-passing Nets for Functional Languages. Electronic Notes in Theoretical Computer Science. 204:181–198.

Token-passing nets were proposed by Sinot as a simple mechanism for encoding evaluation strategies for the λ-calculus in interaction nets. This work extends token-passing nets to cover a typed functional language equipped with structured types and unrestricted recursion. The resulting interaction system is derived systematically from the chosen big-step operational semantics. Along the way, we characterize and discuss several design decisions of token-passing nets, and extend them in order to achieve simpler interaction-net systems with a higher degree of embedded parallelism.
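
As a hint at the starting point of such a derivation (our sketch; the interaction-net encoding itself is beyond a few lines), a call-by-value big-step evaluator for the untyped λ-calculus looks like this:

```haskell
-- untyped λ-terms with named variables
data Term = Var String | Lam String Term | App Term Term
  deriving Show

-- naive substitution: adequate here because, under call-by-value
-- evaluation of closed terms, the substituted value is itself closed,
-- so no variable capture can occur
subst :: String -> Term -> Term -> Term
subst x s t@(Var y)   = if x == y then s else t
subst x s t@(Lam y b) = if x == y then t else Lam y (subst x s b)
subst x s (App f a)   = App (subst x s f) (subst x s a)

-- big-step, call-by-value: a closed term evaluates to a λ-abstraction
eval :: Term -> Term
eval (App f a) =
  case eval f of
    Lam x b -> eval (subst x (eval a) b)
    v       -> App v (eval a)   -- stuck (only for open terms)
eval t = t
```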