Recent Publications

Frade MJ, Saabas A, Uustalu T.  2007.  Foundational certification of data-flow analyses. 1st IEEE/IFIP Symposium on Theoretical Aspects of Software Engineering - TASE. :107-116.

Data-flow analyses, such as live variables analysis, available expressions analysis, etc., are usefully specifiable as type systems. These are sound and, in the case of distributive analysis frameworks, complete with respect to appropriate natural semantics on abstract properties. Applications include certification of analyses and "optimization" of functional correctness proofs alongside programs.

On the example of live variables analysis, we show that analysis type systems are applied versions of more foundational Hoare logics describing either the same abstract property semantics as the type system (liveness states) or a more concrete natural semantics on transition traces of a suitable kind (future defs and uses). The rules of the type system are derivable in the Hoare logic for the abstract property semantics and those in turn in the Hoare logic for the transition trace semantics. This reduction of the burden of trusting the certification vehicle can be compared to foundational proof-carrying code, where general-purpose program logics are preferred to special-purpose type systems and universal logic to program logics.

We also look at conditional liveness analysis to see that the same foundational development is also possible for conditional data-flow analyses proceeding from type systems for combined “standard state and abstract property” semantics.
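
As a concrete illustration of the running example (ours, not code from the paper), the following Haskell sketch computes live variables for a tiny imperative language of our own choosing, using the standard backward equation live_in(s) = use(s) ∪ (live_out(s) \ def(s)); this is the kind of judgement such a type system or Hoare logic describes.

import qualified Data.Set as Set
import Data.Set (Set)

type Var = String

data Stmt
  = Assign Var [Var]    -- x := e, where the list holds the variables read by e
  | Seq Stmt Stmt
  | If [Var] Stmt Stmt  -- the condition reads the listed variables
  | While [Var] Stmt

-- liveIn s out: variables live on entry to s, given the set live on exit.
liveIn :: Stmt -> Set Var -> Set Var
liveIn (Assign x rhs) out = Set.fromList rhs `Set.union` Set.delete x out
liveIn (Seq s1 s2)    out = liveIn s1 (liveIn s2 out)
liveIn (If c s1 s2)   out = Set.unions [Set.fromList c, liveIn s1 out, liveIn s2 out]
liveIn (While c body) out = go out
  where
    -- iterate to a fixed point; the set grows monotonically and is bounded
    -- by the variables occurring, so the iteration terminates
    step x = Set.unions [Set.fromList c, liveIn body x, out]
    go x   = let x' = step x in if x' == x then x else go x'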

Fernandes JP, Pardo A, Saraiva JA.  2007.  A shortcut fusion rule for circular program calculation. Haskell Workshop. :95-106.

Circular programs are a powerful technique to express multiple traversal algorithms as a single traversal function in a lazy setting. In this paper, we present a shortcut deforestation technique to calculate circular programs. The technique we propose takes as input the composition of two functions, where the first builds an intermediate structure and some additional context information, which are then processed by the second one to produce the final result. Our transformation into circular programs achieves intermediate structure deforestation and multiple traversal elimination. Furthermore, the calculated programs preserve the termination properties of the original ones.
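
As an illustration of the class of programs concerned, the classic repmin circular program (not an example taken from the paper) performs a single lazy traversal that both computes the minimum of a tree and rebuilds the tree with every leaf replaced by that minimum, by feeding the result back into the very call that produces it:

-- The classic repmin circular program; only works in a lazy setting,
-- because the minimum m is consumed while it is still being computed.
data Tree = Leaf Int | Fork Tree Tree

repmin :: Tree -> Tree
repmin t = t'
  where
    (m, t') = go t m
    go (Leaf n)   rep = (n, Leaf rep)
    go (Fork l r) rep =
      let (ml, l') = go l rep
          (mr, r') = go r rep
      in (min ml mr, Fork l' r')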

Fernandes JP, Saraiva JA.  2007.  Tools and libraries to model and manipulate circular programs. Workshop on Partial Evaluation and Program Manipulation. :102-111.

This paper presents techniques to model circular lazy programs in a strict, purely functional setting. Circular lazy programs model any algorithm based on multiple traversals over a recursive data structure as a single traversal function. Such elegant and concise circular programs are defined in a (strict or lazy) functional language and are transformed into efficient strict and deforested, multiple traversal programs using attribute grammar-based techniques. Moreover, we use standard slicing techniques to slice such circular lazy programs. We have expressed these transformations as a Haskell library and constructed two tools: the HaCirc tool, which refactors Haskell lazy circular programs into strict ones, and the OCirc tool, which extends OCaml with circular definitions, allowing programmers to write circular programs in OCaml notation that are transformed into strict OCaml programs before they are executed. The first benchmarks of the different implementations are presented and show that, for algorithms relying on a large number of traversals, the resulting strict, deforested programs are more efficient than the lazy ones, both in terms of runtime and memory consumption.
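
For comparison, a strict, multiple-traversal counterpart of the repmin program shown above might take the following shape; this is our sketch of the kind of program such a transformation targets, not actual output of the HaCirc or OCirc tools.

-- A strict, two-traversal version of repmin: one pass computes the
-- minimum, a second pass rebuilds the tree.
data Tree = Leaf Int | Fork Tree Tree

treeMin :: Tree -> Int
treeMin (Leaf n)   = n
treeMin (Fork l r) = min (treeMin l) (treeMin r)

replaceLeaves :: Int -> Tree -> Tree
replaceLeaves m (Leaf _)   = Leaf m
replaceLeaves m (Fork l r) = Fork (replaceLeaves m l) (replaceLeaves m r)

repminStrict :: Tree -> Tree
repminStrict t = replaceLeaves (treeMin t) t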

Silva A, Rutten J.  2007.  Behavioural differential equations and coinduction for binary trees. Proceedings of the Workshop on Logic, Language, Information and Computation 2007 (WoLLIC 2007). 4576:322–336.

We study the set TA of infinite binary trees with nodes labelled in a semiring A from a coalgebraic perspective. We present coinductive definition and proof principles based on the fact that TA carries a final coalgebra structure. By viewing trees as formal power series, we develop a calculus where definitions are presented as behavioural differential equations. We present a general format for these equations that guarantees the existence and uniqueness of solutions. Although technically not very difficult, the resulting framework has surprisingly nice applications, which is illustrated by various concrete examples.
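
A small Haskell sketch (ours, with illustrative names) of this style of definition: an infinite node-labelled binary tree is given by its root label ("initial value") and its two subtrees ("derivatives"), and operations are specified one behavioural differential equation at a time.

-- Infinite node-labelled binary trees, viewed coinductively.
data Tree a = Node a (Tree a) (Tree a)

-- Pointwise sum, defined by the behavioural differential equation
--   (s + t)(root) = s(root) + t(root),  (s + t)_l = s_l + t_l,  (s + t)_r = s_r + t_r
plusT :: Num a => Tree a -> Tree a -> Tree a
plusT (Node x xl xr) (Node y yl yr) = Node (x + y) (plusT xl yl) (plusT xr yr)

-- The everywhere-n tree, the unique solution of  t(root) = n,  t_l = t_r = t.
constT :: a -> Tree a
constT n = t where t = Node n t t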

Abreu R, Zoeteweij P, Van Gemund AJC.  2007.  On the accuracy of spectrum-based fault localization. Testing: Academic and Industrial Conference Practice and Research Techniques-Mutation - TAICPART-Mutation. :89–98.

Spectrum-based fault localization shortens the test-diagnose-repair cycle by reducing the debugging effort. As a light-weight automated diagnosis technique it can easily be integrated with existing testing schemes. However, as no model of the system is taken into account, its diagnostic accuracy is inherently limited. Using the Siemens Set benchmark, we investigate this diagnostic accuracy as a function of several parameters (such as quality and quantity of the program spectra collected during the execution of the system), some of which directly relate to test design. Our results indicate that the superior performance of a particular similarity coefficient, used to analyze the program spectra, is largely independent of test design. Furthermore, near-optimal diagnostic accuracy (exonerating about 80% of the blocks of code on average) is already obtained for low-quality error observations and limited numbers of test cases. The influence of the number of test cases is of primary importance for continuous (embedded) processing applications, where only limited observation horizons can be maintained.
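
To make the setting concrete, the sketch below ranks blocks of code by a similarity coefficient computed from program spectra. The abstract does not name the coefficient it studies, so the Ochiai coefficient used here, and all names, are illustrative choices of ours.

-- spectra: one row of block-coverage flags per test run; failed: did the run fail?
-- Higher scores mark more suspicious blocks.
ochiai :: [[Bool]] -> [Bool] -> [Double]
ochiai spectra failed = map score [0 .. nBlocks - 1]
  where
    nBlocks = length (head spectra)
    runs    = zip spectra failed
    count p = fromIntegral (length (filter p runs)) :: Double
    score j =
      let a11 = count (\(row, f) -> (row !! j) && f)      -- block hit, run failed
          a10 = count (\(row, f) -> (row !! j) && not f)  -- block hit, run passed
          a01 = count (\(row, f) -> not (row !! j) && f)  -- block missed, run failed
          denom = sqrt ((a11 + a01) * (a11 + a10))
      in if denom == 0 then 0 else a11 / denom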

Zoeteweij P, Abreu R, Golsteijn R, Van Gemund AJC.  2007.  Diagnosis of embedded software using program spectra. 14th Annual IEEE International Conference and Workshops on the Engineering of Computer-Based Systems - ECBS. :213–220.

Automated diagnosis of errors detected during software testing can improve the efficiency of the debugging process, and can thus help to make software more reliable. In this paper we discuss the application of a specific automated debugging technique, namely software fault localization through the analysis of program spectra, in the area of embedded software in high-volume consumer electronics products. We discuss why the technique is particularly well suited for this application domain, and through experiments on an industrial test case we demonstrate that it can lead to highly accurate diagnoses of realistic errors.

Debnath N, Garis A, Riesco D, Montejano G.  2007.  Defining OCL constraints for the Proxy Design Pattern Profile. IEEE/ACS International Conference on Computer Systems and Applications - AICCSA '07. :880-885.

Profiles allow the UML vocabulary to be extended, and design patterns define a common vocabulary for software designers; it is therefore possible to use a profile to define a pattern vocabulary in UML. If profiles are used to represent patterns, then neither a special notation nor a dedicated CASE tool for patterns is required (a standard UML tool is used). Profiles offer three mechanisms: stereotypes, tagged values and constraints. Stereotypes extend the UML vocabulary, and tagged values and constraints can be attached to them. When these elements are introduced in models, patterns can be clearly visualized, and software developers improve communication and establish a common vocabulary. Profiles also allow information to be added to a model so that it can be transformed into other models. OCL (Object Constraint Language) constraints are semantic restrictions added to UML elements. This work shows how OCL constraints can be used to define semantic restrictions over the stereotypes included in a pattern profile. The definition of OCL constraints for the Proxy design pattern is shown as an example of our proposal. An interaction between users and the UML tool is proposed to address the loss of generality that arises when OCL constraints are imposed.

Avvenuti M, Corsini P, Masci P, Vecchio A.  2007.  Opportunistic computing for wireless sensor networks. IEEE International Conference on Mobile Adhoc and Sensor Systems - MASS 2007. :1–6.

Wireless sensor networks are moving from academia to real world scenarios. This will involve, in the near future, the design and production of hardware platforms characterized by low cost and small form factor. As a consequence, the amount of resources available on a single node, i.e. computing power, storage, and energy, will be even more constrained than today. This paper addresses the problem of storing and executing an application that exceeds the memory resources available on a single node. The proposed solution is based on the idea of partitioning the application code into a number of opportunistically cooperating modules. Each node contributes to the execution of the original application by running a subset of the application tasks and providing service to the neighboring nodes.

Ferreira F, Pacheco H.  2007.  XPTO - An XPath Preprocessor with Type-aware Optimization. Proceedings of the Conference on Compilers, Related Technologies and Applications (CORTA 2007).

Bento M, Preguiça N, Moreno CB, Martins JL.  2006.  Reconciliation for mobile computing environments with portable storage devices. Actas da Conferência sobre Sistemas Móveis e Ubíquos.

Mobile computing environments have changed in recent years with the increasing use of different types of portable devices, ranging from mobile phones to laptops, and from MP3 players to portable storage devices (e.g. flash disks). Many of these devices have large amounts of storage, allowing users to transport most of their data with them. In this paper we briefly present the FEW file management system, a system that aims to ease file management in this new mobile environment. In particular, we detail the automatic reconciliation approach used in this system based on operational transformation. We motivate our work with a study of conflicts in data managed by version control systems.

Campos JC, Doherty G.  2006.  Supporting resource-based analysis of task information needs. Interactive Systems: Design, Specification and Verification - DSV-IS. 3941:188-200.

We investigate an approach to modelling the dynamic information requirements of a user performing a number of tasks, addressing both the provision and the representation of information, and viewing the information as distributed across a set of resources. From knowledge of the resources available at the user interface and of the tasks' information needs, we can identify whether the system provides the user with adequate support for task execution. We look at how tools can help us reason about these issues, and illustrate their use through an example. We also consider the full range of analyses suggested by this approach that could potentially be supported by automated reasoning systems.

Campos JC, Ribeiro AN.  2006.  UML no Desenvolvimento de Sistemas Interactivos. Interacção 2006 - Actas da 2ª Conferência Nacional em Interacção Pessoa-Máquina. :77-80.

Typical system analysis and modelling processes, from a software engineering perspective, attach little importance to modelling the interactive layer. The Unified Process and the language that supports it (UML) are no exception. This paper proposes an approach that seeks to integrate modelling of the interactive layer into the typical UML-based modelling process, exploiting the capabilities offered by the language.

Mano A, Campos JC.  2006.  Cognitive walkthroughs in the evaluation of user interfaces for children. Interacção 2006 - Actas da 2ª Conferência Nacional em Interacção Pessoa-Máquina. :195-198.

This paper describes a case study dealing with the application of the cognitive walkthrough as a method of evaluating an interface built for children. We performed the walkthrough and tested the interface with children aged between 5 and 7 years old. Given our goals and the scope of this study, the cognitive walkthrough proved to be a reliable source of indications about usability problems on an interface aimed at children.

Rodrigues S, Campos JC, Ribeiro AN.  2006.  Adaptação nativa de interfaces com o utilizador em dispositivos móveis. CSMU 2006 - Conference on Mobile and Ubiquitous Systems. :171-174.

In this paper we present ongoing work on the development of a framework that enables user interfaces to be adapted directly in the native environment of the devices, thus providing the user with application interfaces suited both to the device and to the interaction style the user is accustomed to.

Silva JC, Campos JC, Saraiva JA.  2006.  Models for the Reverse Engineering of Java/Swing Applications. 3rd International Workshop on Metamodels, Schemas, Grammars, and Ontologies for Reverse Engineering.

Interest in the design and development of graphical user interfaces (GUIs) has grown over the last few years. However, the correctness of GUI code is essential to the correct execution of the overall software. Models can help in the evaluation of interactive applications by allowing designers to concentrate on their more important aspects. This paper describes our approach to reverse engineering abstract GUI models directly from Java/Swing legacy code.