I'm looking to adapt the PageRank algorithm as a centrality measure in a network. Unlike the "random surfer" of the original PageRank paper, or the random library browser of Eigenfactor.org, however, this network has no browser who can leave and jump off to some other network: the hypothetical reader reads only this literature, and reads it completely.
As I understand it, the damping factor d in the usual implementation of PageRank is one minus the probability that the random surfer jumps to a random other page, and is usually set to 0.85. In an entirely closed network, is it then reasonable to set this value to 1.0, or is there something I'm not seeing?
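To make sure I have the mechanics right, here is a minimal power-iteration sketch (plain numpy, a toy graph of my own invention, not any particular library's implementation). The (1 - d)/n teleportation term in the update is exactly what disappears if I set d = 1.0:

```python
# Minimal PageRank by power iteration on a made-up 3-paper citation graph.
# adj[i][j] = 1 means paper i cites paper j.
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10, max_iter=1000):
    A = np.asarray(adj, dtype=float)
    n = A.shape[0]
    M = np.zeros_like(A)
    out_deg = A.sum(axis=1)
    for i in range(n):
        if out_deg[i] > 0:
            M[i] = A[i] / out_deg[i]   # spread rank evenly over out-citations
        else:
            M[i] = 1.0 / n             # dangling paper: spread uniformly
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        # The (1 - d)/n term is the random jump; it vanishes when d = 1.0.
        r_new = (1 - d) / n + d * (M.T @ r)
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r_new

adj = [[0, 1, 1],   # paper 0 cites papers 1 and 2
       [0, 0, 1],   # paper 1 cites paper 2
       [0, 0, 0]]   # paper 2 cites nothing (dangling)
print(pagerank(adj, d=0.85))
print(pagerank(adj, d=1.0))   # the closed-network variant I'm asking about
```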
Some details of the network, which would probably be helpful:
All the networks I will be looking at are fairly small (fewer than 1000 nodes) and directed. They're citation networks, with papers as nodes and citations as directed edges, so there are inherently no isolated nodes: a paper is only included if it cites, or is cited by, another paper in the network. There's no reason to believe the networks are strongly connected; in fact, I'm fairly sure they inherently are not (a quick sanity check of this is sketched below).
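For concreteness, this is the sort of check I mean, using an invented edge list in which an edge points from the citing paper to the cited paper; networkx exposes the damping factor as alpha:

```python
# Invented toy citation network: an edge (X, Y) means paper X cites paper Y.
import networkx as nx

edges = [("A", "B"), ("A", "C"), ("B", "C"), ("D", "C"), ("E", "D")]
G = nx.DiGraph(edges)

print(nx.is_weakly_connected(G))                  # every paper is linked to something
print(nx.is_strongly_connected(G))                # False: citations rarely form cycles
print(nx.number_strongly_connected_components(G)) # one trivial component per paper here

# networkx's pagerank takes the damping factor as alpha
print(nx.pagerank(G, alpha=0.85))
```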
I'm not familiar at all with the details of PageRank theory, but here's an intuitive answer: suppose you have a huge connected graph plus a single isolated vertex that you wish to reach. Without random jumps the surfer can never reach it by following links, no matter how long they surf. Does the algorithm exclude such bad instances? More generally, if the graph is not strongly connected, the random jumps are what make every vertex reachable, and hence what make the stationary rank vector unique.
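To make that concrete, here is a small sketch with a made-up graph consisting of two disconnected pieces (plain numpy, just an illustration). With damping, the rank vector comes out the same no matter where the surfer starts; with d = 1.0 the chain is reducible, and whatever you compute depends entirely on the starting component:

```python
# Two disconnected 3-node components; M[i][j] = probability of stepping i -> j.
import numpy as np

M = np.array([[0.0, 1.0, 0.0, 0.0, 0.0, 0.0],   # component {0, 1, 2}
              [0.0, 0.0, 1.0, 0.0, 0.0, 0.0],
              [0.5, 0.5, 0.0, 0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0, 0.0, 1.0, 0.0],   # component {3, 4, 5}
              [0.0, 0.0, 0.0, 0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0, 0.5, 0.5, 0.0]])

def ranks(d, start, iters=2000):
    r = np.asarray(start, dtype=float)
    r = r / r.sum()
    n = len(r)
    for _ in range(iters):
        r = (1 - d) / n + d * (M.T @ r)   # random-jump term gone when d = 1.0
    return r.round(3)

print(ranks(0.85, [1, 0, 0, 0, 0, 0]))   # same result...
print(ranks(0.85, [0, 0, 0, 1, 0, 0]))   # ...whatever the starting vector
print(ranks(1.0,  [1, 0, 0, 0, 0, 0]))   # all rank trapped in the first component
print(ranks(1.0,  [0, 0, 0, 1, 0, 0]))   # all rank trapped in the second component
```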