Alexey Radul gave a talk on "The Art of the Propagator" [1]. His slides are available at [2].
Here are the key points raised during the talk and during the subsequent discussion. The notes are also available at [3]. There could be mistakes in the scribe notes; feel free to correct them if you find any.
Alexey: A propagator is an abstract machine that reads some cells and writes to some cells.
Lalana: Does it allow rewriting cells? And what about the temporal aspects of the system?
Alexey: The state is explicitly exposed in the cells, and the propagator network itself is stateless. It's always on and asynchronous.
Joe: Does it accept, for example, a clock pulse input?
Alexey: We are avoiding the consequences of time.
Alexey: Gave the example of mutually inverse propagators using an adder and a subtractor, and the multi-directional computation example using Celsius -> Fahrenheit -> Kelvin temperature conversion.
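The multi-directional example can be sketched in a few lines. This is a minimal illustration in Python, not Radul's actual Scheme implementation; the names `Cell`, `propagator`, and `add_content` are chosen here for clarity. Two mutually inverse propagators share the same pair of cells, so a value written to either cell propagates to the other.

```python
# Minimal propagator-network sketch (illustrative, not the talk's code):
# cells hold a value or None; a propagator re-runs whenever an input changes.

class Cell:
    def __init__(self, name):
        self.name = name
        self.content = None
        self.neighbors = []  # propagators to re-run when this cell changes

    def add_content(self, value):
        if self.content is None:
            self.content = value
            for run in self.neighbors:
                run()
        elif self.content != value:
            # Two sources disagreed about the same cell.
            raise ValueError(f"contradiction in cell {self.name}")

def propagator(inputs, output, fn):
    def run():
        if all(c.content is not None for c in inputs):
            output.add_content(fn(*(c.content for c in inputs)))
    for c in inputs:
        c.neighbors.append(run)
    run()

# Celsius <-> Fahrenheit as two mutually inverse propagators on shared cells.
c, f = Cell("C"), Cell("F")
propagator([c], f, lambda x: x * 9 / 5 + 32)
propagator([f], c, lambda x: (x - 32) * 5 / 9)

f.add_content(212.0)
print(c.content)  # 100.0
```

Writing `c.add_content(100.0)` first would drive the computation the other way; the network has no preferred direction, which is the point of the example.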
Alexey: A cell can get multiple inputs. This can lead to many problems, even when the inputs carry different but compatible values. Therefore, the cells hold *information* about the values, and not the values themselves.
Ian: Information flow is decided by the constraints of the propagator network.
Alexey: Information is flexible and ideally monotonic. The constraints compose into multi-directional computations which can grow incrementally without adjusting explicit controls. Interval arithmetic provides a solution to temporal inconsistencies between the different values obtained from different sources.
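The interval-arithmetic point can be made concrete with a small sketch. This is an assumption-laden illustration (the `Interval` class and `intersect` method are named here for exposition, not taken from the paper): a cell merges interval information by intersection, so each new contribution can only narrow the estimate, never widen it, which is the monotonicity property mentioned above.

```python
# Sketch of monotonic information merging via interval intersection
# (illustrative names, not the paper's Scheme code).

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def intersect(self, other):
        # The merged interval is consistent with both sources.
        lo, hi = max(self.lo, other.lo), min(self.hi, other.hi)
        if lo > hi:
            raise ValueError("contradictory intervals")
        return Interval(lo, hi)

# Two estimates of the same quantity from different sources:
a = Interval(3.0, 7.0)
b = Interval(5.0, 9.0)
merged = a.intersect(b)
print(merged.lo, merged.hi)  # 5.0 7.0
```

Because intersection is commutative and associative, the order in which sources arrive does not affect the final content of the cell.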
Alexey: The paper was about low-level propagator networks. But the same concept can be applied to higher-level abstractions such as the "Deadbeat Dad" scenario.
Lalana: Functionality-wise, how does a propagator network differ from a regular TMS?
Alexey: This can be thought of as a distributed TMS.
Gerry: The propagator network distributes the computation across the cells.
Kenny: The cells are global?
Alexey: No
Joe: Are cells unique?
Alexey: Yes
K: There are many issues related to data management and controlling access to data. Data from multiple sources could be synchronized to compose a compound answer or the answer could be taken from a single original source.
[scribe note: the question was not that clear to me (was distracted for some reason). So, please add / correct me if I am wrong]
Gerry: Compound answer might be better.
Ian: What is the difference between Actor model[4] and this?
Gerry: The Actor model is mainly for concurrent computation, but the propagators do not implement concurrency.
Kenny: Can you give more examples for the Partial Information scenario?
Alexey: TMSes, machine learning results, Waltz's line labelling algorithm, real-number constraint satisfaction algorithms (?), probability distribution networks.
Lalana: How are you reading the cells? Through URIs as identifiers? A pub-sub protocol?
Alexey: Right now they are just pointers to the computer's memory. This is still a model of computation at a very low level.
Oshani: If we use URIs as the contents of the cells, don't we have to assume the cells to be immutable?
Gerry: We can use versioning as a proof of what the current state is.
Alexey: This is entirely a simulation of a computation model. No concurrency is implemented yet; only one propagator runs at a time.
Gerry: My main worry is about the garbage collection.
Kenny: What are the future plans of this project?
Alexey:
1. Partial Information over Compound data structures
2. Abstraction - Put a boundary around the propagator network
3. Probabilistic models of data structures for inferencing - How can we do continuous instead of discrete?
[1] http://dspace.mit.edu/handle/1721.1/44215
[2] http://dig.csail.mit.edu/2009/DIG_Seminar/0312-alexey.pdf
[3] http://dig.csail.mit.edu/2009/DIG_Seminar/notes/0312.xhtml
[4] http://en.wikipedia.org/wiki/Actor_model