Harvard Computer Science Technical Reports for 2010

TR-01-10 [tr-01-10.pdf (364 K)]
William Yuen, Paul Syverson, Zhenming Liu, and Christopher Thorpe. 2010. ``Intention-Disguised Algorithmic Trading.''
We propose a general model underlying the problem of designing trading strategies that leak no information to frontrunners and other exploiters. We study major scenarios in the market and design a family of algorithms that can be proven to leak no information in important scenarios. These algorithms can serve as building blocks for more challenging real-world scenarios beyond our current scope. In contrast to previous work, the strategies we propose protect traders using the existing trading infrastructure.
TR-02-10 [tr-02-10.pdf (611 K)]
Feng Zhu and Michael Mitzenmacher. 2010. ``The Evolution of Two-Sided Markets: A Dynamic Model.''
Static models of two-sided markets often lead to multiple equilibria as a result of increasing returns to demand. Typically the market can be monopolistic or oligopolistic in equilibrium. We provide a simple dynamic model to address the equilibrium selection problem. Our results indicate that the final market structure depends critically on the overall strength of the indirect network effects of the two sides. Interestingly, we find that under weak network effects the market share advantage of the leader diminishes over time. Simulating our model under the monopoly conditions also suggests that time until dominance follows a power law distribution, and as a result the probability that monopoly does not occur for an extended period is not trivial. We then discuss the implications of our results for intermediaries.
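A minimal toy simulation of this kind of dynamic is sketched below. The update rule, the parameter alpha for the strength of the indirect network effects, and all numbers are illustrative assumptions, not the model analyzed in the report.

    import random

    def simulate(alpha=0.5, periods=200, seed=0):
        """Toy dynamic of two competing two-sided platforms.

        alpha: assumed strength of the indirect network effect.
        Each period one new agent arrives on each side and joins a
        platform with probability proportional to the installed base
        of the *other* side raised to alpha.
        """
        rng = random.Random(seed)
        base = {1: [1, 1], 2: [1, 1]}  # platform -> [side-A users, side-B users]
        for _ in range(periods):
            for side in (0, 1):
                other = 1 - side
                w1 = base[1][other] ** alpha
                w2 = base[2][other] ** alpha
                chosen = 1 if rng.random() < w1 / (w1 + w2) else 2
                base[chosen][side] += 1
        total = sum(sum(users) for users in base.values())
        return sum(base[1]) / total  # final market share of platform 1

    # Weak effects tend to keep shares balanced; strong effects tend to
    # tip the market toward one platform.
    print(simulate(alpha=0.3), simulate(alpha=2.0))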
TR-03-10
Hilary Finucane and Michael Mitzenmacher. 2010. ``An Improved Analysis of the Lossy Difference Aggregator.''

TR-04-10 [tr-04-10.pdf (399 K)]
Stephen Chong. 2010. ``Required Information Release.''
Many computer systems have a functional requirement to release information. Such requirements are an important part of a system's information security requirements. Current information-flow control techniques are able to reason about permitted information flows, but not required information flows.

In this paper, we introduce and explore the specification and enforcement of required information release in a language-based setting. We define semantic security conditions that express both what information a program is required to release, and how an observer is able to learn this information. We also consider the relationship between permitted and required information release, and define bounded release, which provides upper and lower bounds on the information a program releases. We show that both required information release and bounded release can be enforced using a security-type system.
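As a concrete, hypothetical illustration of a required release (not an example from the paper): a login check is required to release exactly one bit, namely whether the guess matches the stored password, and is permitted to release nothing more about the password.

    def check_login(stored_password: str, guess: str) -> bool:
        # Required (lower bound): the boolean outcome of the comparison
        # must be observable to the caller.
        # Permitted (upper bound): no further information about
        # stored_password may flow to the caller.
        return stored_password == guess

    print(check_login("hunter2", "12345"))  # False: only the match bit is released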

TR-05-10 [tr-05-10.pdf (379 K)]
Gregory Malecha and Stephen Chong. 2010. ``A More Precise Security Type System for Dynamic Security Tests.''
The move toward publicly available services that store private information has increased the importance of tracking information flow in applications. For example, network systems that store credit-card transactions and medical records must be assured to maintain the confidentiality and integrity of this information. One way to ensure this is to use a language that supports static reasoning about information flow in the type system. While useful in practice, current type systems for checking information flow are imprecise, unnecessarily rejecting safe programs. This annoys programmers and often results in increased code complexity in order to work around these artificial limitations. In this work, we present a new type system for statically checking information flow properties of imperative programs with exceptions. Our key insight is to propagate a context of exception handlers and check exceptions at the throw point rather than propagating exceptions outward and checking them at the catch sites. We prove that our type system guarantees the standard non-interference condition and that it is strictly more permissive than the existing type system for Jif, a language that extends the Java type system to reason about information flow.
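The sketch below is only a hypothetical illustration of the kind of implicit flow through exceptions that such a type system must track; it is ordinary Python rather than Jif, and the comments describe the intuition rather than the report's typing rules.

    SECRET_LIMIT = 100  # assumed high-confidentiality threshold

    def audit(amount: int) -> str:
        log = "checked"  # public bookkeeping, independent of the secret
        try:
            if amount > SECRET_LIMIT:        # throw point: condition on secret data
                raise ValueError("over limit")
        except ValueError:
            pass  # the secret-dependent control flow ends inside this handler
        # Checking the throw against its enclosing handler context localizes
        # the secret control dependence to the try/except block; pushing the
        # exception outward before checking it can force more of the function
        # (or its callers) to be treated as secret-dependent, rejecting some
        # safe programs.
        return log  # depends only on public data

    print(audit(50))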
TR-06-10 [tr-06-10.pdf (149 K)]
Fabiano Romeiro and Todd Zickler. 2010. ``Ensemble Learning for Reflectometry.''
In "Blind Reectometry" (Romeiro and Zickler, 2010 [3]) we describe a variational Bayesian approach to inferring material properties (BRDF) from a single image of a known shape under unknown, real- world illumination. This technical report provides additional details of that approach. First, we detail the prior probability distribution for natu- ral lighting environments. Second, we provide a derivation of the bilinear likelihood expression that is based on discretizing the rendering equation. Third and nally, we provide the update equations for the iterative algo- rithm that computes an approximation to the posterior distribution of BRDFs.
TR-07-10 [tr-07-10.pdf (1542 K)]
Nicholas R. Hoff III, Amelia Sagoff, Robert J. Wood, and Radhika Nagpal. 2010. ``Two Foraging Algorithms for Robot Swarms Using Only Local Communication.''
Large collections of robots have the potential to perform tasks collectively using distributed control algorithms. These algorithms require communication between robots so that they can coordinate their behavior and act as a collective. In this paper we describe two algorithms that allow coordination between robots but do not require physical marks in the environment, such as pheromones. Instead, these algorithms rely on simple, local, low-bandwidth, direct communication between robots. We describe the algorithms and measure their performance in worlds with and without obstacles.
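One generic way to coordinate foraging with only local, direct communication (a hypothetical illustration, not necessarily either of the two algorithms in this report) is for robots to maintain hop-count gradients toward the nest by exchanging estimates with neighbors in communication range:

    import math

    COMM_RANGE = 1.5  # assumed communication radius

    def update_hops(robots):
        """One round of local message exchange.

        robots: list of dicts with 'pos' (x, y) and 'hops', the current
        hop-count estimate of the distance to the nest.
        """
        for r in robots:
            for other in robots:
                if other is not r and math.dist(r["pos"], other["pos"]) <= COMM_RANGE:
                    # adopt a shorter route advertised by a neighbor
                    r["hops"] = min(r["hops"], other["hops"] + 1)

    robots = [
        {"pos": (0.0, 0.0), "hops": 0},         # at the nest
        {"pos": (1.0, 0.0), "hops": math.inf},
        {"pos": (2.0, 0.0), "hops": math.inf},
    ]
    for _ in range(3):  # a few communication rounds
        update_hops(robots)
    print([r["hops"] for r in robots])  # -> [0, 1, 2]

Robots can then descend such a gradient toward the nest (and an analogous one toward food) without leaving any marks in the environment.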
TR-09-10 [tr-09-10.pdf (1542 K)]
Ayan Chakrabarti, Keigo Hirakawa, and Todd Zickler. 2010. ``Computational Color Constancy with Spatial Correlations.''
The color of a scene recorded by a trichromatic sensor varies with the spectral distribution of the illuminant. For recognition and many other applications, we seek to process these measurements to obtain a color representation that is unaffected by illumination changes. Achieving such color constancy is an ill-posed problem because both the spectral distribution of the illuminant and the scene reflectance are unknown. For the most part, methods have approached this problem by leveraging the statistics of individual pixel measurements, independent of their spatial context. In this work, we show that the strong spatial correlations that exist between measurements at neighboring image points encode useful information about the illuminant and should not be ignored. We develop a method to encode these correlations in a statistical model and exploit them for color constancy. The method is computationally efficient, allows for the incorporation of prior information about natural illuminants, and performs well when evaluated on a large database of natural images.
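As a rough illustration of why spatial structure matters (this is the well-known grey-world versus grey-edge contrast, not the method developed in this report): an illuminant estimate built from spatial derivatives uses exactly the neighboring-pixel correlations that per-pixel statistics ignore.

    import numpy as np

    def grey_world(img):
        """Per-pixel statistic: mean color, ignoring spatial context."""
        e = img.reshape(-1, 3).mean(axis=0)
        return e / np.linalg.norm(e)

    def grey_edge(img):
        """Spatial statistic: mean magnitude of local color derivatives."""
        dx = np.abs(np.diff(img, axis=1)).reshape(-1, 3).mean(axis=0)
        dy = np.abs(np.diff(img, axis=0)).reshape(-1, 3).mean(axis=0)
        e = dx + dy
        return e / np.linalg.norm(e)

    # Synthetic image with an assumed reddish illuminant cast.
    img = np.random.rand(64, 64, 3) * np.array([1.0, 0.8, 0.6])
    print(grey_world(img), grey_edge(img))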
TR-10-10 [tr-10-10.pdf (9420 K)]
Fabiano Romeiro and Todd Zickler. 2010. ``Inferring Reflectance under Real-world Illumination.''
We address the problem of inferring homogeneous reflectance (BRDF) from a single image of a known shape in an unknown real-world lighting environment. With appropriate representations of lighting and reflectance, the image provides bilinear constraints on the two signals, and our task is to blindly isolate the latter. We achieve this by leveraging the statistics of real-world illumination and estimating the reflectance that is most likely under a distribution of probable illumination environments. Experimental results suggest that usable reflectance information can often be inferred, and that these estimates are stable under changes in lighting.