Well, I'm too shy to say why, but as of May 2013, I find the following video really fascinating:
One of those things that I do dare to say publicly is that this video led me to the thought that computer scientists seem to have at least two styles of thinking. One of them is a kind of "parallel" style, where y=f(x) and, when x changes, y is magically updated, because the input in its definition has changed. The other style of thinking is one where the f in y=f(x) is written down as a series of steps, operations, that a pure mathematician performs on paper and a computer scientist has a computer execute. The operations can even be performed in a random order, but there is still a set of operations.
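As a minimal sketch of the two styles, assuming a made-up f(x) = x * 2 and a hypothetical Cell class (neither is from any particular framework): in the "series of steps" style the caller must recompute y by hand, while in the "parallel" style y is a definition, so reading it always reflects the current x.

```python
def f(x):
    return x * 2

# "Series of steps" style: y is a stored value that must be recomputed.
x = 3
y = f(x)   # y == 6
x = 5
y = f(x)   # y is stale until we recompute it by hand; now y == 10

# "Parallel" style: y is a definition, not a stored value, so it is
# "magically" up to date whenever x changes.
class Cell:
    def __init__(self, x):
        self.x = x

    @property
    def y(self):
        return f(self.x)

c = Cell(3)
print(c.y)  # 6
c.x = 5
print(c.y)  # 10, without anyone explicitly recomputing it
```

Note that even the "parallel" style is ultimately implemented as steps: the property just re-runs f on every read.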
Consider a set of light switches and lights in a house. For the sake of simplicity, assume that there is exactly one light per switch. The y=f(x) would be set_of_lights_that_are_on=f(set_of_switches_that_are_at_on_position). After f receives an update of the set_of_switches_that_are_at_on_position, the state of the lights can be updated in a random order, i.e. first the kitchen light and then the bathroom light, or vice versa, but the activity of dimming or activating the lights by f can still be written down as a series of steps, like a loop over an array or a series of assembly instructions, etc.
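The light-switch example can be sketched roughly like this (the switch-to-light mapping and room names are made up for illustration): the definition of f is "instantaneous", yet evaluating it is still a series of steps, here a loop over the switches in whatever order the set yields them.

```python
# Hypothetical one-light-per-switch mapping.
LIGHT_FOR_SWITCH = {
    "kitchen_switch": "kitchen_light",
    "bathroom_switch": "bathroom_light",
}

def f(set_of_switches_that_are_at_on_position):
    # The "instantaneous" definition, executed as a series of steps:
    # iterate over the on-position switches (in arbitrary order) and
    # collect the corresponding lights.
    lights_on = set()
    for switch in set_of_switches_that_are_at_on_position:
        lights_on.add(LIGHT_FOR_SWITCH[switch])
    return lights_on

# After each update of the switch set, f is re-evaluated:
print(f({"kitchen_switch"}))                      # {'kitchen_light'}
print(f({"kitchen_switch", "bathroom_switch"}))   # both lights on
```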
I know that this post seems dumb, but simple, elemental things can be a source of big confusion, and clearing them up helps with understanding. Hence the distinction between the "instantaneous" style of thinking and the "series of steps" style of thinking, and the description of the relation between the two.