since January 2014
March 2012 to August 2013
Member of a group tasked with stimulating and reinvigorating the research profile of the School of Computing and Engineering (in support of the Research Excellence Framework 2014).
This work had two primary outputs:
- building a BAP/Volatility interface to aid low-level OS teaching and enhance the static analysis capabilities of Volatility – code and write-up available at libbap
- implementing a DSL for building, configuring and deploying workflows for automated assessment feedback (work done in collaboration with the High Performance Computing group) – code and write-up available at cloud-paper
October 2008 to February 2012
Higher-level web programming (e.g. Ruby on Rails and PHP/MySQL), server/network administration and security (with custom OS debuggers and network scanners built in Python).
Higher- and masters-level reverse engineering (e.g. using PaiMei, BinNavi and IDA Pro), ethical hacking and computer forensics (e.g. using Autopsy, FTK, EnCase and Volatility).
Masters-level advanced OO programming with JML-based assertional programming, reflection techniques and aspect-oriented programming.
Introductory and higher-level C++ programming and design modules.
August 2004 to February 2012
Part-time Software Engineer
Part-time self-employed Software Engineer, specialising in:
- the application and use of open-source software to build secure, high-reliability systems
October 2007 to October 2008
School of Computing and Engineering, University of Huddersfield
Part-time Senior Lecturer
Responsible for teaching PHP/MySQL web programming and advanced object-oriented programming techniques such as: JML-based assertional programming; reflection techniques; and aspect-oriented programming.
October 2006 to October 2007
School of Computing and Engineering, University of Huddersfield
Part-time Senior Lecturer
Responsible for teaching of introductory and advanced C++ programming
and design modules.
September 1997 to August 2004
School of Computing and Engineering, University of Huddersfield
Responsible for the teaching of introductory and advanced Java programming and program design modules to all levels of degree students.
Courses were taught using software (written by myself) that automatically analysed student programming solutions every 24 hours (via a mixture of JUnit test suites and ESC/Java-driven program analysis) and generated formative feedback on them (via XML and XSLT). As a result, student programming and design skills improved significantly.
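The overall shape of such a pipeline (per-test outcomes collected into XML, then rendered for the student) can be sketched as follows. This is my own minimal illustration in Python with hypothetical test names and notes; the original system used JUnit, ESC/Java and an XSLT stylesheet rather than the code below.

```python
import xml.etree.ElementTree as ET

def build_feedback_xml(results):
    """Collect per-test outcomes into an XML feedback document, analogous
    to the intermediate XML the original pipeline fed to XSLT.
    `results` maps test name -> (passed, note)."""
    root = ET.Element("feedback")
    for test, (passed, note) in sorted(results.items()):
        elem = ET.SubElement(root, "test", name=test,
                             outcome="pass" if passed else "fail")
        elem.text = note
    return root

def render_feedback(root):
    """Render the XML as plain text for the student (the original system
    used an XSLT transformation for this step)."""
    lines = []
    for t in root.findall("test"):
        mark = "OK " if t.get("outcome") == "pass" else "FIX"
        lines.append(f"[{mark}] {t.get('name')}: {t.text}")
    return "\n".join(lines)

# Hypothetical results, as a test runner might report them.
results = {
    "testPush": (True, "stack push behaves as specified"),
    "testPopEmpty": (False, "popping an empty stack should raise an error"),
}
print(render_feedback(build_feedback_xml(results)))
```

The XML intermediate keeps the analysis step decoupled from the presentation step, which is what makes a nightly batch run over many submissions straightforward.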
Other responsibilities included the development and validation of new courses (e.g. the BSc(Hons) Secure and Forensic Computing degree program and Declarative Architectures), supervision of PhD and MSc research students, and a lead role within the departmental and school admissions teams (responsible for the development of recruitment and assessment strategies, and for designing, organising and developing laboratory sessions for applicants).
Mathew Ford – PhD applying compilation techniques to the analysis of railway signalling systems
Flavia Mongini – MSc applying category theory to formal concept analysis and decomposition theory
November 1995 to September 1997
Department of Computing, University of Warwick
Full-time Research Fellow
Employed on an EPSRC-funded project investigating the validation of program implementations. The project made extensive use of automatic proof checkers (e.g. PVS and Isabelle) to analyse and verify properties of annotated C object code.
I designed and implemented (in Isabelle HOL/SML) a polymorphic implementation of the pi-calculus and an associated refinement calculus. I used this implementation to model the semantics of object code, with the refinement calculus being used to abstract inessential details from the object code.
Program-specific information, for use in the object program’s verification, was obtained by exploiting various static analysis techniques (e.g. control and data flow analyses, abstract interpretation) and compiler hints (obtained by annotating the object code at compile time).
Computer Arithmetic: Logic, Calculation and Rewriting
by M. Benini, D. Nowotka and C. Pulley. Frontiers of Combining Systems (FroCoS’98), October 2nd-4th 1998, University of Amsterdam
This paper describes the concepts behind the Computer Arithmetic Toolkit (CAT) for the Isabelle/HOL system written by the authors.
Holly: An Approach to Object Code Verification
by M. Benini, S. Kalvala, D. Nowotka and C. Pulley. 5th Annual Automated Reasoning Workshop (ARW 5): Bridging the Gap between Theory and Practice, co-located with the 14th BCTCS, St. Andrews, 1998.
Formal verification of software typically consists in checking programs in a high-level language against an abstract specification. Confidence in the correctness of the actual program running on a machine also relies on the soundness of the translation from the high-level language to the machine code, i.e., the correctness of a compiler, including its code optimisation techniques.
If one wishes to have confidence in the final code, another approach is object code verification, i.e., verification of program code at assembly level. Now the confidence in the implementation of a program on an actual machine is more direct, relying on the correctness of the hardware model and no other software.
But, dealing with object code also has a price. It is, in a sense, further away from its specification because it has to deal with a more concrete machine, considering registers, limited memory, and so on. There is a vast shift in granularity between code and specification.
The technique for object code verification we are developing emphasises the use of abstraction, aided by a process algebraic framework. The concept of simulation provides the tool with which abstractions on the process algebraic representation of the program can be formalised.
However, the correctness proof takes place in higher-order logic (HOL), which we believe presents a more amenable proof environment. Abstraction at the logical level is obtained by unlifting results from the CCS level using the CCS semantics. More precisely, let < be a simulation relation, and let [| – |]^s be the lifting operator for a particular sequence s of states, then:
If P < Q then ∀ s · [| P |]^s ⇒ [| Q |]^s .
The interaction between the logical and the algebraic level is fundamental for our methodology.
We have implemented this approach in a proof system we call Holly. We use Isabelle/HOL, in which we embedded a variation of Milner’s Calculus of Communicating Systems (CCS) as a process algebra. When verifying a program, its object code is translated both into a representation in HOL and an equivalent representation in CCS. At any point in a proof either deductions in HOL or simulations in the process algebra can be used.
Holly demonstrates the concurrent use of a logical and an algebraic model to make object code verification feasible. In this talk, we expand on the interaction between these models, and highlight the interaction within an example proof in Holly.
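As a toy illustration of the simulation concept underlying this approach, the largest simulation relation between two small labelled transition systems can be computed by iterated refinement. This is my own minimal Python sketch with hypothetical states and labels; Holly itself works inside Isabelle/HOL and CCS, not Python.

```python
def is_simulation(states_p, states_q, trans_p, trans_q):
    """Compute the largest simulation relation between two finite labelled
    transition systems by iterated refinement.  trans_p / trans_q map a
    state to a set of (label, successor) pairs."""
    rel = {(p, q) for p in states_p for q in states_q}
    changed = True
    while changed:
        changed = False
        for (p, q) in list(rel):
            # q must be able to match every labelled move of p, staying in rel
            matched = all(
                any(a == b and (p2, q2) in rel
                    for (b, q2) in trans_q.get(q, set()))
                for (a, p2) in trans_p.get(p, set())
            )
            if not matched:
                rel.discard((p, q))
                changed = True
    return rel

# Toy systems: P performs a then b; Q can perform a then (b or c),
# so Q simulates P from the initial states p0, q0.
trans_p = {"p0": {("a", "p1")}, "p1": {("b", "p2")}}
trans_q = {"q0": {("a", "q1")}, "q1": {("b", "q2"), ("c", "q3")}}
sim = is_simulation({"p0", "p1", "p2"}, {"q0", "q1", "q2", "q3"},
                    trans_p, trans_q)
print(("p0", "q0") in sim)  # True: q0 simulates p0
```

Once such a simulation P < Q is established algebraically, the lifting property quoted above licenses replacing P by the more abstract Q inside the logical proof.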
CAT is a Computer Arithmetic Toolkit for the Isabelle/HOL theorem prover. It provides a theory for integers, a rewrite engine for integer expressions, and a decision procedure for Presburger formulas.
Version 1.0 beta of CAT is available here; it requires Isabelle94-8 and Poly/ML 3.2.
- Source code distribution: CAT.tgz
- Documentation (also included in the distribution):
May 1992 to May 1995
Department of Computation, University of Manchester Institute of Science and Technology (UMIST)
Full-time Research Associate
Employed under the SERC/DTI Advanced Technology Program Safety Critical Systems Initiative (project no. IED4/1/9035). This was a collaborative project between UMIST, GEC Alsthom Signalling Limited and Westinghouse Signals Limited into the formal analysis and verification of the Solid State Interlocking (SSI) data language (used to specify the functionality of railway signalling interlocking installations).
Inference procedures for verifying behavioural properties of an SSI were specified in Z and implemented in Prolog++.
EPSRC reported that this project had made a very significant contribution to its field.
Paul Dongha – PhD using resource-bounded agents to build formal micro-economic models with temporal/action logics
Logical Methods in the Formal Verification of Safety-Critical Software
by C. Pulley and G. V. Conroy. In the IMA Conference Proceedings: Mathematics of Dependable Systems, 1st – 3rd September 1993, Royal Holloway, University of London
The purpose of the work reported in this paper is to describe a formal procedure by which one may automatically verify the safety of a specific SSI interlocking. Unlike other research work reported elsewhere (see e.g. [Mor93] and [IngMit92]), we explicitly model the IFP’s polling cycle within the formalism to be presented shortly. This has the added benefit of enabling us to model the SSI’s behaviour more correctly and also to formally verify the safety of the IFP. Unlike [IngMit92], we are able to express and verify much more general safety properties. Our work also differs from the above in that we have concentrated on improving the efficiency of our decision procedure rather than on the modes by which certain types of safety property may be verified.
- [Mor93]: Morley, M. J. (1991) Modeling British Rail’s Interlocking Logic: Geographic Data Correctness. LFCS Report ECS-LFCS-91-186, University of Edinburgh, November 1991.
- [IngMit92]: Ingleby, M. and Mitchell, I. (1992) Proving Safety of a Railway Signaling System Incorporating Geographic Data. Proceedings SAFECOMP ’92, 28th-30th October 1992, Zurich, Switzerland.
Overview of the Proof Verifier
by C. Pulley. DTI/SERC “SSI Tools” Project, 1995, No. 7.6
This document defines (or rather, shows how to define) a Floyd-Hoare programming logic for the SSI Data Language. This programming logic may easily (if tediously) be implemented using techniques from type checking.
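The weakest-precondition flavour of such a Floyd-Hoare logic can be sketched for a toy assignment/sequence language. This is my own illustration: the toy language, and the exhaustive check over a small finite state space, stand in for the SSI Data Language and a real proof procedure.

```python
def wp_assign(var, expr, post):
    """Weakest precondition of `var := expr` with respect to `post`:
    post must hold in the state updated with the assigned value."""
    return lambda s: post({**s, var: expr(s)})

def wp_seq(cmds, post):
    """wp of a sequence of assignments, computed back to front."""
    for var, expr in reversed(cmds):
        post = wp_assign(var, expr, post)
    return post

def valid_triple(pre, cmds, post, states):
    """Check the Hoare triple {pre} cmds {post} by testing
    pre => wp(cmds, post) over an exhaustively enumerated state space."""
    weakest = wp_seq(cmds, post)
    return all(weakest(s) for s in states if pre(s))

# Toy triple: {x >= 0}  y := x; y := y + 1  {y > 0}
cmds = [("y", lambda s: s["x"]), ("y", lambda s: s["y"] + 1)]
states = [{"x": x, "y": y} for x in range(-3, 4) for y in range(-3, 4)]
print(valid_triple(lambda s: s["x"] >= 0, cmds,
                   lambda s: s["y"] > 0, states))  # True on this state space
```

Computing wp syntactically over the program text is what makes the implementation resemble a type checker: both are structural traversals of the syntax tree.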
Specification of the Proof Assistant
by C. Pulley. DTI/SERC “SSI Tools” Project, 1995, No. 7.4
This report provides a full specification in Z of the proof search algorithm.
The reachability algorithm specified in this report has been structured in a manner similar to the structuring of the various graph algorithms found in the works of Hopcroft and Tarjan. This has enabled a much more generic and flexible approach to the algorithm’s specification to be taken.
- Hopcroft, J. E. and Tarjan, R. E. (1973) Algorithm 447: Efficient Algorithms for Graph Manipulation, Communications of the ACM 16(6), 372-378.
- Tarjan, R. E. (1972) Depth-First Search and Linear Graph Algorithms, SIAM Journal of Computing 1(2), 146-160.
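The Hopcroft-Tarjan style of structuring, in which the graph representation is a parameter of a generic depth-first traversal rather than baked into the algorithm, can be illustrated with this minimal sketch of mine (the graph data is hypothetical):

```python
def dfs_reach(succ, start):
    """Iterative depth-first search computing the set of nodes reachable
    from `start`.  The graph is supplied as a successor function `succ`,
    keeping the traversal generic over the graph representation."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen:
            continue
        seen.add(node)
        stack.extend(succ(node))
    return seen

# A small graph given as an adjacency dict (hypothetical data).
graph = {"a": ["b", "c"], "b": ["d"], "c": [], "d": ["a"]}
print(sorted(dfs_reach(lambda n: graph.get(n, []), "a")))  # ['a', 'b', 'c', 'd']
```

The same traversal skeleton then serves reachability, cycle detection, or component analysis simply by varying the successor function and the actions taken at each visit.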
A Proof Calculus for the SSI Geographic Data Language
by C. Pulley. DTI/SERC “SSI Tools” Project, 1995, No. 7.2
Efficient Inference Techniques for Verifying that an SSI is Safe/Unsafe
by C. Pulley. DTI/SERC “SSI Tools” Project, 1995, No. 7.1
Description of the Concrete and Abstract Syntax for the SSI Geographic Data Language
by C. Pulley. DTI/SERC “SSI Tools” Project, 1994, No. 6.1.2
This document is essentially three separate documents combined into one:
- Firstly, we have an EBNF description of the concrete syntax for the SSI Data Language.
- This EBNF description is presented alongside a Z description of the abstract syntax of the SSI Data Language.
- Finally, in appendix A, we have a Prolog implementation of a parser for the SSI Data Language. The parser is written using Prolog’s DCG grammar rules; the rules take in a Prolog implementation of the EBNF description of the concrete syntax tree and pass back a Prolog implementation of the Z abstract syntax tree.
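The DCG style of parsing, in which each rule consumes a prefix of the input and passes back a syntax-tree fragment along with the remaining input, can be sketched in Python as follows. The toy grammar is my own and not the SSI Data Language, and the original parser was written in Prolog, not Python.

```python
def parse_atom(tokens):
    """An atom is a single identifier token.  Returns (node, remaining)."""
    if not tokens:
        raise SyntaxError("expected an identifier")
    return tokens[0], tokens[1:]

def parse_if(tokens):
    """Toy DCG-style rule: consume a prefix of the token list and pass back
    an AST node together with the remaining tokens, much as a Prolog DCG
    rule threads a difference list while building an output term."""
    if tokens[:1] != ["if"]:
        raise SyntaxError("expected 'if'")
    cond, rest = parse_atom(tokens[1:])
    if rest[:1] != ["then"]:
        raise SyntaxError("expected 'then'")
    body, rest = parse_atom(rest[1:])
    return ("if", cond, body), rest

ast, rest = parse_if("if route_set then signal_green".split())
print(ast)  # ('if', 'route_set', 'signal_green')
```

In Prolog the token-threading is handled implicitly by the DCG translation; here it is made explicit by returning the remaining token list from every rule.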
Partial Semantics for the SSI Data Language
by C. Pulley. DTI/SERC “SSI Tools” Project, 1993, No. 6.1.1
This document, along with deliverables 6.1.2 and 6.1.3, provides a full syntactic and semantic specification (in the Z notation and a BNF like notation) of the SSI data language and the IFP.
Deliverable 6.1.3 provides a specification of the IFP and how it interacts with and uses information within the SSI data files.
This document describes a semantics for the various constructs used within the SSI data files to describe an interlocking’s functionality.
Semantics for the Interlocking Functional Program
by C. Pulley. DTI/SERC “SSI Tools” Project, 1993, No. 6.1.3
This document, along with deliverables 6.1.1 and 6.1.2, provides a full syntactic and semantic specification (in the Z notation and a BNF like notation) of the SSI data language and the IFP.
Deliverable 6.1.1 provides a specification of the SSI data language constructs that are used within the various SSI data files to specify an interlocking’s functionality.
This document describes how the IFP interacts with and uses the information within these SSI data files. The concepts of major and minor cycle are introduced to facilitate the description of the IFP.
Approaches for Automated Reasoning with Solid-State Interlockings and their Geographic Data
by C. Pulley. DTI/SERC “SSI Tools” Project, 1992, No. 5.1
This report introduces some logics that we suggest are of use in the formal specification of existing SSI geographic data. We then go on to show that validity for these logics is decidable.
For us this opens up the possibility of being able to automatically verify that properties hold of our SSI geographic data (in particular, we are here interested in the safety properties of a given railway layout).
The logics we propose to use are based on the second-order monadic theory of successor arithmetic.
Analysis of the use of Galois Theory to reduce Time/Space Complexities
by C. Pulley. DTI/SERC “SSI Tools” Project, 1994, No. U013
The purpose of this technical note is to derive an intuitive representation for the clopen subsets of the topologies defined by Ingleby and Mitchell.
In doing this, we are able to show that the Galois theory approach of Ingleby and Mitchell simply partitions the SSI data code into control fragments that have little or no dependence upon each other. Unfortunately, the majority of railway layouts involve many interactions, and so their partitioning algorithm is of little help with the verification of such layouts.
- Ingleby, M. and Mitchell, I. (1992) Proving Safety of a Railway Signaling System Incorporating Geographic Data. Proceedings SAFECOMP ’92, 28th-30th October 1992, Zurich, Switzerland.
August 1986 to October 1988
GEC and Plessey Telecommunications Limited
Key team member responsible for specifying and designing a new storage management process for the System X operating system. The specification was in VDM (with rely and guarantee conditions to handle concurrency), and the implementation was in (BT) Coral and POPUS2 assembler.
The techniques were successfully applied to deliver a high-reliability product. Central to this was the use of my mathematical skills to prove coexistence arguments for this concurrent operating system process.
Reference: Using VDM with Rely and Guarantee-Conditions – Experiences from a Real Project by J.Woodcock and B.Dickinson. VDM Europe (1988).