One of our most fundamental notions is our concept of self. Somehow the billions of neurons and trillions of synapses synthesize a "self" out of their connections and operations. For neuroscience this presents a difficult challenge: how do the many systems and subsystems of the brain cooperate to produce a unified concept of self? I don't know the answer, and I'm pretty sure neuroscientists don't either, but I have an analogy.
Internet computer games like Altitude need to seamlessly integrate the worlds of multiple players scattered all over the globe. Each player has a copy of the "world" of the game, complete with terrain, other players, and the projectiles the players are firing at each other. Each player is controlling an aircraft and constantly inputting commands to speed up, slow down, fire projectiles, etc. The "worlds" are not exactly the same, since each player has a privileged position on his own game map, but the game would fall apart if the various players' worlds failed to maintain consistency. That consistency is maintained by a steady stream of messages updating each player's map in response to every game event.
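The idea can be sketched in a few lines of code. This is a toy illustration of the analogy, not Altitude's actual networking protocol: the names `World`, `apply_event`, and `broadcast` are all hypothetical, and real games add latency compensation and prediction on top of this basic scheme.

```python
from dataclasses import dataclass, field

@dataclass
class World:
    """One player's local copy of the shared game state."""
    positions: dict = field(default_factory=dict)  # player name -> (x, y)

    def apply_event(self, event):
        """Update this copy of the world in response to a game event."""
        kind, player, payload = event
        if kind == "move":
            self.positions[player] = payload

def broadcast(event, worlds):
    """Every game event is sent to every player's copy of the world."""
    for w in worlds:
        w.apply_event(event)

# Two players, each holding their own copy of the world.
worlds = [World(), World()]
broadcast(("move", "alice", (10, 4)), worlds)
broadcast(("move", "bob", (3, 7)), worlds)

# Consistency: after every event reaches every copy, the copies agree.
assert worlds[0].positions == worlds[1].positions
```

As long as every event eventually reaches every copy, the copies converge on the same picture of the game, even though no single copy is "the" world.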
Our brains must do something similar when we integrate sounds, smells, and visual information with memory to construct a model of our world. These models, too, are constantly updated. Sometimes the two distant headlights are a truck and sometimes they are two motorcycles. Sometimes the flashing lights behind you are a policeman and sometimes they turn out to be an alien spaceship.
Identifying an unfamiliar object by touch is a good test case. While in a cluttered environment, close your eyes and start grabbing things. Most, I found, could be identified nearly instantly, which is not too surprising considering that I'm in my computer room. The less familiar the environment and the objects, the more difficult the task becomes. The kinds of information we need to integrate are quite varied: size, shape, texture, relative location, whether the object is easily moved or not, its resistance to pressure.